Hi there!
I am trying to get my head around a solution I'm developing for a client.
They have raw data in CSVs with 200-300 columns and about 4-5k rows. Before this uploaded dataset can be used, some of the information needs to be “decoded”:

- Case A: a column contains only a non-readable numerical code that needs translating into text.
- Case B: a column contains a list of codes, each of which requires decoding.
- Case C: several columns feed simple calculations that produce a combined score.
It seems to me the appropriate approach might be to upload the raw data into a separate Data Type aptly called Upload, have a recurring backend workflow check that Data Type for new rows, convert the new rows into the Data Types that will be read to create simple charts (I already have a great plugin for that), and, once a row has been decoded and written to the live Data Types, delete it from the Upload Data Type.
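To illustrate the loop I have in mind (just a rough Python sketch of the logic, not how it would actually be built in Bubble; the names `upload_rows`, `live_rows` and `decode_row` are made up):

```python
# Rough sketch of the recurring backend process:
# "upload_rows" stands in for the Upload Data Type and "live_rows" for the
# live Data Types behind the charts; decode_row does the actual decoding.

def process_new_uploads(upload_rows, live_rows, decode_row):
    processed = []
    for row in upload_rows:
        decoded = decode_row(row)   # cases A, B and C happen in here
        live_rows.append(decoded)   # write the decoded row to the live data
        processed.append(row)
    # delete from Upload only after the decoded row has been written
    for row in processed:
        upload_rows.remove(row)
```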
Am I on the right track? Which type of operation would my workflow likely use to convert from numerical code to clear text?
My guess for case A was to create Option Sets (those codes do not really change) with the code and its display text, and then have the conversion workflow look up code A in the Option Set to return clear text A.
Case B is similar, just converting each item of each list.
Case C would be a workflow that simply calculates the combined score and writes the result to the live Data Type.
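Put together, the per-row decoding I'm imagining would be something like this (again only a Python sketch of the logic; the column names, code table and score formula are invented for illustration):

```python
# Case A lookup table: static code-to-text mapping (an Option Set in Bubble).
SEVERITY_LABELS = {1: "low", 2: "medium", 3: "high"}   # invented example codes

def decode_row(row):
    decoded = dict(row)

    # Case A: translate a single numerical code into readable text
    decoded["severity"] = SEVERITY_LABELS.get(int(row["severity_code"]), "unknown")

    # Case B: a column holding a comma-separated list of codes,
    # each item decoded with the same lookup
    codes = [int(c) for c in row["severity_history"].split(",") if c.strip()]
    decoded["severity_history"] = [SEVERITY_LABELS.get(c, "unknown") for c in codes]

    # Case C: combine several numeric columns into one score
    # (the formula here is only a placeholder)
    decoded["combined_score"] = int(row["score_a"]) + int(row["score_b"]) + int(row["score_c"])

    return decoded
```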
Is this an efficient approach or would you suggest a different route?
Thank you for your help!
Kind regards