How TIM Automates AI
We recently found the image below going around online which paints a clear and accurate picture of what working with AI entails:
Although all of these blocks are valid, that doesn't mean covering them has to require a lot of time and effort. For time series analysis with AI, the TIM platform automates or supports most of these activities, and in this post we will outline exactly how.
Time series analysis is a challenging task with a unique set of problems inherent to this type of data. While many modeling techniques from other applications could also be applied to time series data, this field requires a solution tailored to its particular problems. For this we have created TIM (Tangent Information Modeler).
Exploring the data is key before time series analysis. Visualizing the data will help you better understand the scope of the task, validate the quality of the data and help you determine the best path to a useful predictive model.
For this we have developed TIM Studio, a user-friendly web application that gives users an intuitive UI for interacting with TIM.
Find out more about TIM Studio.
Data cleaning is always a necessary step before data analysis. TIM does not cover every aspect of data cleaning; however, the following transformations are taken care of automatically.
- Outliers:
TIM will automatically recognize outliers in the data, take them into account in the model building process and notify you that outliers have been found.
- Gaps and missing data:
You can present TIM with small or large gaps in the data and it will automatically impute the missing values. TIM can do so in several different ways.
- Timescale and aggregation:
Specifically for time series analysis, determining the right time scale and then aggregating the data accordingly is a task that often occurs. Therefore we have developed this as a built-in functionality.
- Filters:
Applying different filters across different categories within the data to create specific models for each category is also a built-in functionality.
Find out more:
- about imputation,
- about timescale and aggregation,
- about filters.
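TIM performs these cleaning steps automatically, but to make the transformations concrete, here is a minimal pandas sketch of the same three operations done by hand: flagging outliers with a median-absolute-deviation rule, imputing a gap by interpolation, and aggregating to a coarser timescale. This is an illustration of the general techniques, not TIM's internal implementation.

```python
import pandas as pd
import numpy as np

# Hourly series with a two-hour gap and one obvious outlier (95.0).
idx = pd.date_range("2023-01-01", periods=12, freq="h")
values = [10.0, 11.0, np.nan, np.nan, 12.0, 11.0,
          95.0, 10.0, 11.0, 12.0, 11.0, 10.0]
s = pd.Series(values, index=idx)

# Outlier detection: flag points more than 3 median absolute
# deviations (MAD) away from the median.
med = s.median()
mad = (s - med).abs().median()
outliers = (s - med).abs() > 3 * mad

# Gap imputation: time-weighted linear interpolation across the gap.
imputed = s.interpolate(method="time")

# Timescale and aggregation: resample the hourly data to 4-hour means.
aggregated = imputed.resample("4h").mean()
```

One robust rule (MAD) is used here for brevity; real pipelines typically combine several detection strategies, and TIM exposes several imputation methods.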
Normalization is often a necessary step in data analysis that helps the modeling technique find useful patterns in the data. For TIM this is not a separate step in the data engineering process but a built-in, automated feature the user doesn't need to worry about at all.
Find out more about normalization.
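For readers unfamiliar with the step TIM is automating here: normalization rescales variables so that no single feature dominates purely because of its units. Two common variants are sketched below with NumPy; which variant TIM applies internally is handled automatically and hidden from the user.

```python
import numpy as np

# A predictor whose raw scale differs from other variables in the dataset.
x = np.array([20.0, 25.0, 30.0, 35.0, 40.0])

# Min-max normalization: rescale values onto the [0, 1] interval.
x_minmax = (x - x.min()) / (x.max() - x.min())

# Z-score standardization: shift to zero mean and unit standard deviation.
x_zscore = (x - x.mean()) / x.std()
```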
Feature engineering is the most crucial step in time series analysis. Once your data is ready for analysis, you need to determine which patterns in the data actually carry predictive value. Manual feature engineering often leads to hit-and-miss results that fail to capture the real signals in the data. Another approach is to use brute computing power to evaluate every possible feature, which is usually resource-heavy, impractical and therefore costly.
The best approach is to use the automated feature engineering capabilities of TIM, which uses Instant Machine Learning to quickly and automatically find the relevant features within the data. InstantML uniquely uses a niche in mathematics called Information Geometry, which allows TIM to plot the data in a multi-dimensional space and create geometric structures that can be used to find the best features in a matter of seconds with limited computing resources required.
The result is that you automatically get a list of complex features covering all aspects of time series analysis which are automatically combined into a useful predictive model.
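To give a feel for the kind of features such a search space contains, the pandas sketch below builds a few classic time series features by hand: lags, calendar attributes and rolling statistics. InstantML explores and selects from a far richer space automatically; this is only a manual illustration of the raw material.

```python
import pandas as pd
import numpy as np

rng = np.random.default_rng(0)
idx = pd.date_range("2023-01-01", periods=48, freq="h")
df = pd.DataFrame({"load": rng.normal(100, 10, len(idx))}, index=idx)

# Lag features: the target shifted by 1 hour and by 24 hours (daily cycle).
df["lag_1"] = df["load"].shift(1)
df["lag_24"] = df["load"].shift(24)

# Calendar features: hour of day and day of week often carry seasonality.
df["hour"] = df.index.hour
df["dayofweek"] = df.index.dayofweek

# Rolling statistics: a 6-hour moving average smooths short-term noise.
df["rolling_mean_6"] = df["load"].rolling(6).mean()
```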
The TIM platform is designed to handle time series data specifically. To manage all of this data efficiently, we have created TIM DB, a lightweight data version control system.
TIM takes a different approach from other modeling techniques and requires no model selection. TIM always uses InstantML to build one specific time series model per scenario and therefore never needs to compare and score multiple models. This is one of the reasons TIM gets to results significantly faster. In short, this complex step is no longer required with TIM.
Model training is done fully automatically by TIM. Because TIM uses InstantML, training time is very limited and models can be built in a matter of seconds. All the relevant features found in the feature engineering step are combined into a Generalized Additive Model (GAM), relying on geometrical perspectives and incorporating a tweaked Bayesian Information Criterion.
Find out more about Model building in TIM.
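The standard Bayesian Information Criterion referenced above trades goodness of fit against model complexity: for Gaussian residuals it reduces to BIC = n·ln(RSS/n) + k·ln(n), where lower is better. The sketch below computes this textbook form (TIM's own criterion is a tweaked variant whose details are not public) and shows how a slightly tighter fit can still lose if it spends too many parameters.

```python
import numpy as np

def bic(residuals: np.ndarray, n_params: int) -> float:
    """Textbook BIC for Gaussian residuals:
    BIC = n * ln(RSS / n) + k * ln(n). Lower is better."""
    n = len(residuals)
    rss = float(np.sum(residuals ** 2))
    return n * np.log(rss / n) + n_params * np.log(n)

# Two candidate fits to the same 6 observations: a looser fit with
# 2 parameters and a slightly tighter fit with 5. BIC penalizes the
# extra parameters more than the small gain in fit is worth.
resid_small = np.array([1.0, -1.2, 0.8, -0.9, 1.1, -1.0])  # 2-parameter model
resid_large = np.array([0.9, -1.1, 0.7, -0.8, 1.0, -0.9])  # 5-parameter model

bic_small = bic(resid_small, n_params=2)
bic_large = bic(resid_large, n_params=5)
```

Here the simpler model wins (`bic_small < bic_large`) even though its residuals are larger, which is exactly the parsimony a BIC-style criterion enforces during model building.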
Accuracy metrics and error measures are automatically provided by TIM to validate the quality of the model. There are also processes that you can rely on within TIM that assist you in monitoring the quality over time.
Find out more about Error measures in TIM.
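Typical error measures used to validate time series models include MAE, RMSE and MAPE. The snippet below shows how the standard definitions are computed; it illustrates the metrics in general rather than TIM's specific reporting.

```python
import numpy as np

def mae(actual: np.ndarray, forecast: np.ndarray) -> float:
    # Mean Absolute Error: average size of the errors.
    return float(np.mean(np.abs(actual - forecast)))

def rmse(actual: np.ndarray, forecast: np.ndarray) -> float:
    # Root Mean Squared Error: penalizes large errors more heavily.
    return float(np.sqrt(np.mean((actual - forecast) ** 2)))

def mape(actual: np.ndarray, forecast: np.ndarray) -> float:
    # Mean Absolute Percentage Error; assumes no zero actuals.
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

actual = np.array([100.0, 110.0, 120.0, 130.0])
forecast = np.array([102.0, 108.0, 125.0, 128.0])
```

Note that RMSE is always at least as large as MAE for the same errors, and the gap between them is a quick signal of whether a few large misses dominate.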
Model tuning is another step that is no longer required with TIM. With InstantML there are no hyperparameters to select and compare, a process that can otherwise require considerable computing resources and time. InstantML is deterministic: for the same dataset combined with the same TIM configuration, the same result will always come out.
The default settings of TIM usually find most of the predictive value in the data. You do, however, have the option to enable or disable certain functionalities to tweak the model to your specific needs and use case through the different configuration settings.
Find out more about Configuration in TIM.
Value & Operationalizing
The real value from AI is only obtained once the models are actually placed into production. To avoid getting stuck in the experimentation phase you need to focus on MLOps, which is especially hard for time series use cases. For this you will need a platform that allows you to manage all aspects of MLOps intuitively.
Find out more about the TIM platform.
The TIM platform automatically registers any model built and allows you to organize yourself in an intuitive workflow.
Find out more about Navigating TIM Studio.
There are two aspects of deployment with TIM. Firstly, TIM is a containerized solution which can be deployed in different cloud environments, on-premises and even on edge devices. Secondly, the models themselves are stored within the TIM platform. The platform is wrapped in an API which allows you to easily access your models, data and results. You only need to integrate with one API to access all your models, which significantly simplifies this aspect of MLOps.
Find out more:
- about Deployment,
- about TIM Version 5.
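Integrating against a single API typically means assembling one JSON request per forecast. The sketch below shows what such an integration might look like; the base URL, endpoint path, field names and model identifier are all HYPOTHETICAL placeholders, not TIM's actual interface — consult the official TIM API documentation for the real schema.

```python
import json

# NOTE: everything below is a placeholder illustration. The host,
# endpoint path, payload fields and modelId are NOT TIM's real API.
TIM_BASE_URL = "https://tim.example.com/api"  # hypothetical host

forecast_request = {
    "modelId": "my-model-id",   # hypothetical identifier
    "horizon": 24,              # ask for 24 steps ahead
    "data": [
        {"timestamp": "2023-01-01T00:00:00Z", "load": 101.5},
        {"timestamp": "2023-01-01T01:00:00Z", "load": 98.2},
    ],
}

# With the requests library, the single integration point would be one call:
# response = requests.post(f"{TIM_BASE_URL}/forecast", json=forecast_request)
body = json.dumps(forecast_request)
```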
TIM has built-in functionalities to monitor the quality of the models and predictions over time. Within the TIM Studio you will find an intuitive visualization of your experiments with TIM.
Find out more:
- about Monitoring,
- about Operations View.
Here TIM again takes a different approach compared to other modeling techniques. Because of the speed of InstantML, you can retrain a model every time you require a new prediction. This allows you to remain adaptable to changing circumstances and structural changes in the data. This process is easily automated with TIM and is therefore, from our perspective, only a minor step in AI.
We have a complete analysis comparing single model training with continuous retraining which you can find on our GitHub Page. Here we outline the advantages of remaining flexible to a changing data environment.
Find out more here: Tangent-Works/Time-For-Time-Series (github.com)
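The retrain-per-forecast idea can be illustrated with a deliberately tiny stand-in model. Below, a rolling mean is "retrained" on the latest window before every single prediction, so it adapts quickly when the series undergoes a structural level shift halfway through. This is a toy analogy for the workflow, not TIM's modeling internals.

```python
import numpy as np

rng = np.random.default_rng(1)
# A series whose level jumps from ~100 to ~120 halfway through:
# a structural change in the data.
series = np.concatenate([rng.normal(100, 2, 50), rng.normal(120, 2, 50)])

window = 10  # "retrain" on only the most recent 10 observations
forecasts = []
for t in range(window, len(series)):
    # Refit the (trivial) model before every prediction, mimicking
    # a retrain-per-forecast workflow.
    model_level = series[t - window:t].mean()
    forecasts.append(model_level)

forecasts = np.array(forecasts)
actuals = series[window:]
mean_abs_error = float(np.mean(np.abs(actuals - forecasts)))
```

Because each forecast only ever looks at recent history, the model tracks the new level within a few steps of the shift, whereas a model trained once on the full history would sit permanently between the two regimes.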
AI for businesses is not straightforward at all. The real path to value does not come from reinventing the wheel every time you want to tackle a new use case, but rather from gathering the right tools to assist you throughout the process.
For time series analysis, TIM is the right tool to help you with all the complexities that you will be faced with when working in this field.
Learn more about TIM InstantML →