Best practice architecture for speed - quick question

Hi all,

With our portal, we currently have 5 data types. This is all working fine.

However, I'm wondering what best practice is in terms of speed. For example, are we better off having more data types, with each row having fewer columns/cells? Or are we better off having fewer data types, with each row having more columns/cells?

I'm concerned about speed because one of our data types has more than 25 columns per row.

I am far from a DB guy, so hoping for some expert advice here.


I'd prefer to keep the number of data types to a minimum.

Saving/deleting records across more than one data type in a single workflow will reduce speed significantly.


It depends. I tend to have more data types, 20-30, but I agree with @anwarsby that the workflow side is important: things slow down if a workflow has to create/edit records across multiple data types.

You also have to think about the data you search for. If you do broad searches that return many items, you will see a decrease in speed with 25 columns, because Bubble sends the whole item to the browser.
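To make the payload point concrete, here is a minimal Python sketch (not Bubble's actual data API; the field names are hypothetical). It compares the JSON size of a 25-field record against a slim record holding only the fields a list view actually displays, which is roughly the saving you get by splitting a wide data type into a "light" search type plus a detail type:

```python
import json

# Hypothetical 25-field record, standing in for a wide Bubble data type
# whose every field is sent to the browser with each search result.
wide_order = {f"field_{i}": f"value {i}" for i in range(25)}

# Split design: a slim type keeps only the fields the repeating group
# displays; the remaining detail fields would live in a second type
# fetched only when a single record is opened.
slim_order = {k: wide_order[k] for k in ["field_0", "field_1", "field_2"]}

wide_bytes = len(json.dumps(wide_order))
slim_bytes = len(json.dumps(slim_order))

# The slim payload is a fraction of the wide one, and that difference
# is multiplied by every item a broad search returns.
print(wide_bytes, slim_bytes)
```

The exact byte counts don't matter; the point is that per-item payload scales with column count, so a broad search over a slim type moves far less data.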


I seem to remember that @vincent56 from Ideable has some data about how structure choice affects performance.

Is that right Vincent?

I don't have experience with that many fields in a data type; mine are only 10-15 fields each.

In some data types I even just create a single field (a sequential number) to use as a common ID for a group of records that, in another design, would be two data types in a parent-child relation. I then use this sequence number as a constraint in data sources (repeating groups, dropdowns, groups, etc.), and so far it's been fine.

What I don't know yet is whether, as my data grows bigger and bigger, I can keep the speed by using this sequence number as a data source constraint, rather than using a two-data-type parent-child structure.
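The trade-off above can be sketched in plain Python (a hypothetical analogy, not how Bubble stores data internally). The flat design filters one big table on the sequence number, like a repeating group's constraint; the parent-child design holds direct references from parent to children:

```python
# Flat design: every record carries a group_seq field, and a data source
# constrains on it (the "seq number as constraint" approach).
records = [
    {"name": "line A", "group_seq": 1},
    {"name": "line B", "group_seq": 1},
    {"name": "line C", "group_seq": 2},
]

def lines_for_group(seq):
    # Analogous to a data source constrained on "group_seq = seq".
    # In this naive sketch the filter touches every record, so cost
    # grows with total table size (a real database would index this).
    return [r["name"] for r in records if r["group_seq"] == seq]

# Parent-child alternative: the parent holds direct references to its
# children, so fetching one group doesn't depend on total record count.
groups = {1: ["line A", "line B"], 2: ["line C"]}

print(lines_for_group(1))  # the same group either way
print(groups[1])
```

Both return the same group; the question raised in the post is which lookup holds up as the record count grows, and that depends on how the platform indexes the constrained field.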