I'm building a service that needs to filter activities coming from different products. The actual activity data lives in a separate key-value store; since the service also needs to support filtering on the activities, I was planning to use MySQL for the filtering use case alone. Because activities from different products can differ, and the fields that need to be indexed can differ too, I was thinking of creating a new table for each set of activities, with columns for just the fields that need to be indexed. Do you see any problems with creating tables on the fly for this kind of use case?
Since the service tries to be generic, it does not know the structure of the data beforehand. I'm imagining the schema for each type of activity would look something like this:
tenant_id | activity_id | filter_1 (actor_id) | filter_2 (group_id) | filter_3 (segment_id)

primary key: (tenant_id, activity_id)  (activity_id is a time-based ID, so activities can be sorted by time)

index_1: (tenant_id, filter_1, activity_id)
index_2: (tenant_id, filter_2, activity_id)
index_3: (tenant_id, filter_3, activity_id)

If a given activity use case always queries certain fields together, we might replace the single-field indexes with a composite index. E.g. if filter_2 and filter_3 are always present in the queries:

index_4: (tenant_id, filter_2, filter_3, activity_id)
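For concreteness, here is a minimal DDL sketch of one such generated table, plus a typical query it would serve. The table name, column types, and the query shape are my assumptions for illustration; the real payload stays in the key-value store and only the filterable fields are mirrored here:

```sql
-- Sketch only: one table like this would be generated per activity type.
CREATE TABLE activity_type_x (
    tenant_id   BIGINT UNSIGNED NOT NULL,
    activity_id BIGINT UNSIGNED NOT NULL,  -- time-based ID, sortable by creation time
    filter_1    BIGINT UNSIGNED NOT NULL,  -- e.g. actor_id
    filter_2    BIGINT UNSIGNED NOT NULL,  -- e.g. group_id
    filter_3    BIGINT UNSIGNED NOT NULL,  -- e.g. segment_id
    PRIMARY KEY (tenant_id, activity_id),
    KEY index_1 (tenant_id, filter_1, activity_id),
    KEY index_2 (tenant_id, filter_2, activity_id),
    KEY index_3 (tenant_id, filter_3, activity_id)
);

-- A typical filtered listing, newest first; index_1 covers both the
-- WHERE clause and the ORDER BY, so no extra sort is needed:
SELECT activity_id
FROM activity_type_x
WHERE tenant_id = ? AND filter_1 = ?
ORDER BY activity_id DESC
LIMIT 50;
```

The returned activity_ids would then be looked up in the key-value store to fetch the full activity payloads.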
I'm also worried about the number of indexes we might end up creating with this approach. Is it OK to have so many indexes on a table?