As illustrated across our website, time series are everywhere. Each industry vertical, each domain, each company comes into contact with time series in one way or another. In the use case library below, you can explore the extensive applications of time series, or look for applications in your specific field of interest.
Michal Bezak - Tangent Works
Contact centers typically operate with a pool of resources. Predicting the volume of incoming requests at specific times is therefore critical for proper resource scheduling. In this case, forecasts are expected for very short and short-term horizons (e.g. a week ahead).
A high-quality short-term forecast brings confidence that the FTEs (full-time equivalents) planned for the next week are just right for delivering on SLAs, not to mention other benefits, such as higher confidence when planning absences, or improved morale among employees who no longer face overload from “sudden” volume peaks.
Predicting the volume of requests over mid-term horizons, e.g. three months ahead for weekly data, is an important input to resource management. Moving people around, hiring, upskilling, or downsizing the pool of resources takes time (weeks, if not longer). Because of this, forecasts for longer horizons, from one month to several months ahead, are needed as well.
The picture below depicts how contact centers are typically linked to WFM (workforce management), internal departments and other factors. This provides intuition about which factors should be included in the data used for building models and forecasting.
Big contact centers support not one but multiple regions, cultures, and languages. Forecasting by language or country is therefore very likely to be beneficial.
Having a forecast for just one perspective or one or two prediction horizons is not sufficient. Moreover, the dynamics of an ever-changing business mean that using a model built one month ago is suboptimal. Successful resource management therefore requires the capability to build models and make new predictions instantly.
TIM can forecast for any prediction horizon, from intra-day to days or weeks ahead, and can thus be used for short-term, mid-term as well as long-term forecasting. It builds a new model in a fraction of the usual time, using the latest data, which helps to achieve better accuracy.
It is automated, so your analysts and data scientists have free capacity to focus on other work. You gain new capabilities: more frequent forecasting, or forecasting across various perspectives, becomes possible with minimal additional effort.
The effort and time required to set up forecasting are reduced to a fraction of what would typically be required. TIM, by design, automates most of the steps required for set-up and operations, and offers a robust ML solution capable of quickly adapting to structural changes.
High-quality forecasts deliver the following benefits:
Explanatory variables should include historical actual volume values, meteorological predictors, holiday information, marketing activity (campaign) information, factors describing the customer base, planned outages, and/or other relevant data, available with as low latency as possible.
TIM’s output consists of the forecasted volume of requests per hour/day/week over the selected prediction horizon.
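For illustration only, here is a minimal sketch – independent of TIM and using hypothetical column names and values – of how such an hourly input dataset could be laid out, with one target column and the explanatory variables alongside it:

```python
import pandas as pd

# Hypothetical hourly input dataset: one target column plus explanatory variables.
# Column names and values are illustrative only.
timestamps = pd.date_range("2024-01-01", periods=24 * 7, freq="h")
data = pd.DataFrame({
    "timestamp": timestamps,
    "incoming_requests": 120,        # target: historical actual volumes
    "temperature_forecast": 5.0,     # meteorological predictor
    "is_holiday": 0,                 # calendar information
    "campaign_active": 0,            # marketing activity flag
    "planned_outage": 0,             # planned outages
}).set_index("timestamp")

# For a week-ahead forecast, the explanatory columns would be filled over the
# next 168 hours while the target column stays empty for that horizon.
print(data.head())
```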
Michal Bezak - Tangent Works
Concrete is one of the most important building materials used in construction. It is a composite of fine and coarse aggregates bonded together with a fluid cement that hardens over time to form a solid mass.
Ensuring the strength, quality and durability of concrete is critical for the stability of buildings. Because of this, concrete must pass quality control checks of various parameters.
Compressive strength is one of the parameters used to evaluate the quality. In short, compressive strength is the capacity to withstand loads. It is measured on a universal load testing machine that requires several samples.
With the capability to estimate compressive strength using machine learning, the testing process can be optimized.
TIM can build ML models automatically using historical data. The same principle – a model used for calculating future values (a forecast) – can also be utilised to evaluate qualitative parameters.
Explanatory variables should include the volumes of mix components such as cement, ash, water, and fine/coarse aggregate.
TIM’s output consists of a value in MPa calculated for a given input.
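As a generic illustration of the underlying idea – a stand-in model rather than TIM itself, trained on hypothetical mix records – a regression model can be fitted to map mix proportions and curing age to measured strength:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Hypothetical lab records: mix proportions (kg per m3), curing age (days),
# and the measured compressive strength (MPa) from the testing machine.
df = pd.DataFrame({
    "cement":           [540, 332, 198, 266, 380],
    "fly_ash":          [0,   0,   132, 114, 95],
    "water":            [162, 228, 192, 228, 188],
    "fine_aggregate":   [676, 594, 825, 670, 750],
    "coarse_aggregate": [1040, 932, 978, 932, 1000],
    "age_days":         [28, 28, 90, 28, 56],
    "strength_mpa":     [80, 40, 39, 45, 52],
})

X, y = df.drop(columns="strength_mpa"), df["strength_mpa"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingRegressor().fit(X_train, y_train)
print(model.predict(X_test))  # estimated strength in MPa for unseen mixes
```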
Michal Bezak - Tangent Works
One of the factors that have an impact on battery health and capacity is temperature. With rising temperature, more capacity is available for discharge, and vice versa. Temperature is also an important factor during battery charging. To maximize the lifespan of Li-ion batteries, they should not be charged below 0°C.
Nowadays, advanced battery systems rely on cooling and heating mechanisms that help batteries operate efficiently (and keep them healthy) even in extreme conditions.
Knowing when to take action to prevent overheating starts with getting an accurate forecast. This can apply to a single battery or to multiple batteries installed on a grid.
With various deployment options, including on the edge, TIM can be used in various industry-specific devices (e.g., in EVs).
TIM can build ML models from time-series data and predict temperature within tens of seconds or minutes. Data from device sensors are often sampled on a seconds or even milliseconds basis. TIM can work with data sampled at any rate, starting from milliseconds.
Regularly rebuilding models for each battery can be incredibly beneficial, especially when you consider the factors specific to each battery. Batteries degrade with each charge/discharge cycle; thus a model built for a new battery may not be relevant for an older one. Moreover, the conditions in which batteries are operated (e.g. ambient temperatures) also differ. The discharge profile reflecting usage is another dynamic factor.
Explanatory variables should include measurements from relevant sensors, such as voltage, current, temperature, external conditions and others.
TIM’s output consists of forecasted temperature values.
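To illustrate the data-preparation side (independent of TIM, with hypothetical sensor names), millisecond-level readings can be aggregated to a regular one-second grid before modelling:

```python
import pandas as pd

# Hypothetical raw sensor log sampled roughly every 100 ms.
raw = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01 12:00:00", periods=600, freq="100ms"),
    "voltage_v": 3.7,
    "current_a": 1.2,
    "cell_temp_c": 31.5,
    "ambient_temp_c": 24.0,
})

# Aggregate to a regular 1-second grid: mean of each sensor per second.
per_second = (
    raw.set_index("timestamp")
       .resample("1s")
       .mean()
)

# per_second is now a regular time series suitable as model input,
# with cell_temp_c as the target and the other columns as predictors.
print(per_second.head())
```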
Michal Bezak - Tangent Works
The accelerating adoption of electric vehicles (EVs) is driving improvements in battery technology at unprecedented speed. Bigger capacities, faster charging, and a longer lifespan of batteries are in focus.
Despite this progress, battery capacity still constrains how we use batteries, and until there is a substantial breakthrough, information about how much time is left until complete discharge is particularly important.
Knowing how much time is left helps us plan subsequent actions, such as the optimal route, when to charge, how much additional load can be carried, etc.
TIM allows for various deployment methods (from edge to cloud). TIM can be deployed inside the device (e.g. inside an electric vehicle), or in the cloud to which the battery grid is connected.
TIM can build ML models from time-series data and predict the remaining time until discharge within tens of seconds or minutes. Data from device sensors are often sampled on a seconds or even milliseconds basis. TIM can work with data sampled at any rate, starting from milliseconds.
Regularly rebuilding models for each battery can be incredibly beneficial, especially when you consider the factors specific to each battery. Batteries degrade with each charge/discharge cycle; thus a model built for a new battery may not be relevant for an older one. Moreover, the conditions in which batteries are operated (e.g. ambient temperatures) also differ. The discharge profile reflecting usage is another dynamic factor.
Explanatory variables should include measurements from relevant sensors, such as voltage, current, temperature, external conditions and others.
TIM’s output consists of the forecasted remaining time until complete discharge.
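For intuition only, a naive physics-based baseline – not TIM, and with purely hypothetical figures – estimates the remaining time from the remaining charge and the recent average discharge current; an ML model refines this by learning from voltage, temperature, degradation and usage patterns:

```python
# Naive coulomb-counting baseline for time-to-discharge.
# All figures below are assumptions for illustration.

rated_capacity_ah = 60.0        # hypothetical usable capacity in amp-hours
state_of_charge = 0.45          # 45% charge remaining (estimated elsewhere)
avg_discharge_current_a = 18.0  # recent average current draw

remaining_charge_ah = rated_capacity_ah * state_of_charge
remaining_hours = remaining_charge_ah / avg_discharge_current_a

print(f"Estimated time to complete discharge: {remaining_hours:.2f} h")
# -> roughly 1.5 h under these assumed conditions
```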
Michal Bezak - Tangent Works
Every year, billions of payment card transactions are made worldwide. Card companies spend vast resources to keep card operations fast and secure. Fraudulent misuse affects both debit and credit cards, and the costs incurred due to card fraud run as high as tens of billions of dollars annually.
This is a broad topic: securing card operations does not stop at protecting data and avoiding data breaches. Card issuers, banks and merchants need to take countermeasures to combat card payment fraud. Considering the vast volumes and velocity involved, this would not be possible without automation, and AI/ML is the natural choice.
TIM’s RTInstantML technology builds ML models in an automated fashion in a fraction of the usual time. Its capabilities cover time-series forecasting, classification and anomaly detection. Detecting fraudulent activity is a task for classification and/or anomaly detection.
Thanks to this automation and speed, (re)building a new ML model every hour, every couple of minutes, or on demand for specific transactions is entirely possible.
From an operations perspective, TIM can be deployed rapidly and is easy to operate. It can run in the cloud or on the edge, scales automatically, is robust enough to withstand defects in data, and supports various sampling rates.
In classification cases for the detection of fraudulent activity, it is necessary to provide labelled data, i.e. to include a flag indicating to which class a given activity belongs (1 for fraudulent, 0 otherwise).
Explanatory variables would typically include: amount, geolocation information, time parameters, effective credit limit, descriptors of previous transactions, channel, etc. In general, banks and card companies use additional predictors that improve accuracy but are kept undisclosed so as not to give any hints to fraudsters.
TIM’s output in classification tasks is a value ranging from 0 to 1; the closer to 1, the higher the probability that the activity is fraudulent.
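A minimal, generic sketch of such a labelled classification setup – using a stand-in model rather than TIM’s RTInstantML, with assumed column names and decision threshold – could look like this:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical labelled transactions: 1 = fraudulent, 0 = legitimate.
transactions = pd.DataFrame({
    "amount":            [12.5, 890.0, 45.0, 2300.0, 18.0, 1500.0],
    "foreign_country":   [0,    1,     0,    1,      0,    1],
    "hour_of_day":       [14,   3,     11,   2,      19,   4],
    "limit_utilisation": [0.05, 0.80,  0.10, 0.95,   0.07, 0.85],
    "is_fraud":          [0,    1,     0,    1,      0,    1],
})

X = transactions.drop(columns="is_fraud")
y = transactions["is_fraud"]

clf = LogisticRegression().fit(X, y)

# Score a new transaction: probability of the "fraud" class, between 0 and 1.
new_tx = pd.DataFrame([{"amount": 1900.0, "foreign_country": 1,
                        "hour_of_day": 3, "limit_utilisation": 0.9}])
fraud_probability = clf.predict_proba(new_tx)[0, 1]
flagged = fraud_probability > 0.5   # business-chosen decision threshold
print(fraud_probability, flagged)
```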
Michal Bezak - Tangent Works
Companies across a variety of industries rely on machines: pumps, engines, elevators, turbines, etc. Some are more complex than others, but they all have one thing in common – material degradation. With each cycle (moment) of operation, components lose their original physical parameters. Regular checks, diagnostics, maintenance and even replacement are an important part of machine operations.
The ideal scenario is to avoid failure of a given machine altogether, so being proactive rather than reactive is for many businesses the only option. Acting at the right time also has real financial implications. Imagine two extreme situations:
Predictive maintenance solutions can provide the optimal time for maintenance. Thanks to the data coming from sensors and AI/ML, it is possible to get advice, almost in real-time, on what is the best time to take action.
TIM can build automated ML models from time series data and predict the time remaining (Remaining Useful Life, RUL) or classify whether the device is already in a window (zone) of possible failure within a certain period of time (cycles).
Data from machine sensors are often sampled in seconds, or even milliseconds. TIM can work with data sampled at any rate, starting from milliseconds.
Also, the effort and time required to set up TIM for production use are reduced to a fraction of what would typically be required. TIM, by design, automates most of the steps required for set-up and operations, and offers a robust ML solution.
Input: explanatory variables should include measurements from relevant sensors, values of key settings, information about failures, cycle numbers and/or other relevant data.
Output: TIM’s output consists of a forecasted RUL value or a binary classification (1 or 0), depending on the given scenario.
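As a generic illustration of how RUL targets are often derived from run-to-failure histories (an assumption about data preparation, not a TIM requirement), each recorded cycle can be labelled with the number of cycles remaining until the machine’s last observed cycle:

```python
import pandas as pd

# Hypothetical run-to-failure history for two machines:
# each row is one operating cycle with a sensor reading.
history = pd.DataFrame({
    "machine_id": [1, 1, 1, 1, 2, 2, 2],
    "cycle":      [1, 2, 3, 4, 1, 2, 3],
    "vibration":  [0.10, 0.12, 0.18, 0.35, 0.09, 0.11, 0.30],
})

# RUL label per row: cycles remaining until that machine's last recorded cycle
# (assumed to be the failure point).
last_cycle = history.groupby("machine_id")["cycle"].transform("max")
history["rul"] = last_cycle - history["cycle"]

# Optional binary label: 1 if the machine is within an assumed 2-cycle failure window.
history["in_failure_window"] = (history["rul"] <= 2).astype(int)
print(history)
```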
Michal Bezak - Tangent Works
Nowadays, trading is mostly automated; when an order is placed, you can bet it was most likely a robot that pulled the trigger. AI (robots) took over years ago, and the trend continues.
Building a profitable and sustainable trading system requires many elements, from risk management to collection of the right data, back-testing, etc. There is a plethora of areas that can be addressed with AI/ML tools, and they can be framed as forecasting, classification, or anomaly detection problems. All of these are problems that TIM can solve.
TIM is robust and fast. It can work with data sampled at rates from milliseconds to years, data that contain gaps, and irregularly sampled data (just like tick data). It can also build a new ML model in a truly short time, even with each forecast (or classification).
The effort and time required to set up a pipeline with TIM are reduced to a fraction of what would typically be required. TIM, by design, automates most of the steps required for set-up and operations, and offers a robust ML solution capable of quickly adapting to structural changes.
Depending on the problem being solved, the prediction horizon, and the market, different data may be required. Knowing which data to use is typically part of well-protected intellectual property.
For short-term forecasting, market (bar) data combined with technical indicators and correlated or cointegrated assets would be a good start. In a game of leveraged positions chasing the tiniest deltas (movements), high-quality data make the difference.
TIM’s output consists of a forecasted value per step (sample) over the desired forecasting horizon.
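A small, generic sketch of turning bar data into indicator features – the columns, windows and target below are illustrative assumptions, not a TIM prescription:

```python
import pandas as pd

# Hypothetical 1-minute bar data for a single instrument.
bars = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-02 09:30", periods=8, freq="1min"),
    "close": [101.2, 101.4, 101.1, 101.6, 101.9, 101.7, 102.0, 102.3],
    "volume": [1200, 900, 1100, 1500, 1700, 800, 1300, 1600],
}).set_index("timestamp")

features = pd.DataFrame(index=bars.index)
features["return_1m"] = bars["close"].pct_change()           # simple return
features["sma_3"] = bars["close"].rolling(3).mean()          # short moving average
features["volatility_3"] = features["return_1m"].rolling(3).std()
features["volume_z"] = (bars["volume"] - bars["volume"].rolling(5).mean()) / bars["volume"].rolling(5).std()

# The target could be, e.g., the next bar's return; rows with NaNs from the
# warm-up windows would be dropped before model building.
features["target_next_return"] = features["return_1m"].shift(-1)
print(features.dropna())
```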
Michal Bezak - Tangent Works
The metro is one of the most important means of public transport across the globe. It cuts travelling time for millions of people every day, and so its availability is critical.
Metro operations require precise management and forecasting systems. Making accurate forecasts of the volume of passengers travelling on specific lines on a given day (and time) supports decisions about the timely and right-sized dispatch of resources – the right number of carriages prepared with the right number of personnel, etc.
TIM is able to forecast for practically any prediction horizon, spanning from intra-day to days or weeks ahead. The effort and time required to set up forecasting are reduced to a fraction of what would typically be required. TIM, by design, automates most of the steps required for set-up and operations, and offers a robust ML solution capable of quickly adapting to structural changes.
Besides historical actual values, useful data should also include weather and holiday information. Adding (traffic) data about adjacent connection points could improve accuracy even further.
TIM’s output consists of forecasted volumes over the desired forecasting horizon, per hour, 15 minutes, 5 minutes, etc., depending on the sampling of your data.
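For illustration (independent of TIM), calendar and holiday flags can be derived directly from the timestamp index; the holiday list below is a hypothetical placeholder:

```python
import pandas as pd

# Hypothetical 15-minute passenger counts for one metro line.
index = pd.date_range("2024-12-23", periods=4 * 24 * 3, freq="15min")
data = pd.DataFrame({"passengers": 250}, index=index)

# Calendar features derived from the timestamp itself.
data["hour"] = data.index.hour
data["day_of_week"] = data.index.dayofweek          # 0 = Monday
data["is_weekend"] = (data["day_of_week"] >= 5).astype(int)

# Holiday flag from a (placeholder) list of public holidays.
public_holidays = pd.to_datetime(["2024-12-25", "2024-12-26"])
data["is_holiday"] = data.index.normalize().isin(public_holidays).astype(int)

print(data.loc["2024-12-25"].head())
```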
Michal Bezak - Tangent Works
Smart traffic solutions are becoming increasingly important and play a vital role in making our cities (and infrastructure) smarter. They comprise multiple parts, spanning hardware, software, and in recent years also AI/ML.
With a prediction of traffic (and potential congestion), it is possible to better optimize the routes taken and thus cut the time needed to transport goods, people, etc. The value derived from such a capability can be measured with proxy indicators such as the avoidance of (wasted) time spent in traffic jams.
TIM can forecast for practically any prediction horizon, from intra-day to days or weeks ahead. The effort and time required to set up forecasting are reduced to a fraction of what would typically be required. TIM, by design, automates most of the steps required for set-up and operations, and offers a robust ML solution capable of quickly adapting to structural changes.
Besides historical actual values for traffic at a given point, explanatory variables can also include weather and holiday information. Adding (traffic) data about adjacent connection points could improve accuracy even further.
TIM’s output consists of forecasted traffic over the desired forecasting horizon, per hour, 15 minutes, 5 minutes, etc., depending on the sampling of your data.
How to prepare your retail and online production, inventory and distribution for changing COVID-19 measures?
External factors can have a huge impact on your demand forecast for specific products in specific locations and channels. The constantly changing government guidance during the COVID pandemic can also cause huge swings in demand, especially when wide-reaching regulation – such as decisions to close restaurants or limit movement – can come and go within hours for entire towns, cities or countries. To help businesses dynamically allocate their scarce resources of staff and inventory, TIM and real-time instant ML can create forecasts that are as dynamic as the events that influence them. This enables fast business decisions to take advantage of opportunities and limit the costs of reacting to current events.
Talk to our specialist about adaptive retail sales forecasting
The TIM Engine is perfectly suited to this application because of its speed, resilience and ease of deployment. Other forecasting methods, such as statistical forecasting, are far too slow to react to the modern business climate. Single-variate models will miss the complex interaction between seasonal changes, externally driven changes to demand and externally driven changes to mobility. The speed of forecasting is also incredibly important in determining which factors are causing permanent changes to demand and customer behavior and which changes will be temporary.
Using the TIM Engine, an analyst can quickly iterate on models using dozens or even hundreds of features to get predicted impacts on demand in near real-time. This is a must-have tool for anyone in an organization who is responsible for planning of inventory and/or staffing levels.
As an example, using the TIM product we can immediately predict the impact a new restriction in a specific town would have on online ordering in specific postcodes. This can be used to ensure the correct allocation of capital equipment (trucks), staffing (drivers) and product (warehouses) even in advance of the restriction being implemented. With the TIM Engine, this forecast can be available in minutes to respond to significant events that may be happening within only a few days. We can see that a new announcement of a restriction to restaurants causes an immediate surge in online grocery demand that lasts for several days before subsiding to pre-restriction levels. This analysis can be extended to review the impact on specific products at a SKU level – including the ability to run independent demand forecasts for individual SKUs at individual stores in seconds.
Demand Data has real-time access to external data of significance, such as weather data (forecast and history) for any geographic point, as well as real-time COVID cases, mobility and restrictions for each postcode in most major countries. This data can instantly be prepared as inputs to be combined with sales data for specific products, channels or store locations. We have templates available which can plug into sales data at a store level and create the base models instantly (including COVID, weather and human mobility). From there, your analysts can iterate using other data or assumptions they have and get feedback in seconds on which assumptions or data are good predictors and which ones are not.
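As a simple, generic illustration of the data side – not the templates themselves, and with hypothetical names and values – a store-level sales series could be joined with a restriction flag and mobility index for its postcode:

```python
import pandas as pd

# Hypothetical daily online grocery orders for one store / postcode.
sales = pd.DataFrame({
    "date": pd.date_range("2020-10-01", periods=10, freq="D"),
    "online_orders": [340, 355, 360, 350, 365, 520, 610, 590, 480, 400],
})

# Hypothetical external data for the same postcode: COVID restriction level
# (0 = none, 1 = restaurants closed) and a mobility index.
external = pd.DataFrame({
    "date": pd.date_range("2020-10-01", periods=10, freq="D"),
    "restaurants_closed": [0, 0, 0, 0, 0, 1, 1, 1, 1, 1],
    "mobility_index": [98, 97, 99, 96, 95, 70, 62, 65, 68, 72],
})

# One combined dataset: target plus external predictors, ready for modelling.
combined = sales.merge(external, on="date").set_index("date")
print(combined)
```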
Philip Wauters - Tangent Works
Wind turbines have become progressively more influential as the share of energy production and the penetration of wind energy into power systems steadily increase. With this, the need for reliability in the production capacity of wind turbines has increased as well. The turbines must operate as smoothly as possible, since the unscheduled stoppage of these turbines can lead to significant production losses. In this use case the importance of operations and predictive maintenance is highlighted, and especially the role of health monitoring. Continuous monitoring of wind turbine health using anomaly detection improves turbine reliability and efficiency, thus reducing maintenance and wind power costs. Finally, it allows for the optimal timing of turbine maintenance and repairs so that they can reduce the impact on the overall energy production and avoid catastrophic failure of the turbines.
Due to its highly automated, exceptionally fast and reliable modeling algorithm, TIM can build multiple anomaly detection models in a limited amount of time. This is especially useful in this case, since wind turbines often operate in wind farms where multiple turbines need to be monitored simultaneously. The speed and frequency of model building that TIM is capable of also allow for real-time notifications of suspicious behavior in any turbine.
Building a model for the detection of anomalous behavior in wind turbines requires a set of training data with several variables. The power output of a wind turbine is dependent on the efficiency of the blades, gear assembly, alternator/dynamo, as well as wind speed, wind direction and wind consistency. Also, the taller the wind turbine, the greater the energy produced, since wind speeds are greater at higher altitudes. With these variables set up in a time series format, TIM can use its anomaly detection capabilities to determine whether or not a power output observation is abnormal.
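As a generic illustration of the idea – not TIM’s algorithm, and with an assumed power curve and threshold – anomalous output can be spotted by comparing the measured power with what an expected power curve predicts for the observed wind speed:

```python
import numpy as np
import pandas as pd

# Hypothetical 10-minute SCADA readings for one turbine.
obs = pd.DataFrame({
    "wind_speed_ms": [4.0, 6.0, 8.0, 10.0, 12.0, 9.0],
    "power_kw":      [150, 550, 1300, 2200, 3000, 900],   # last value looks low
})

# Very rough expected power curve (kW) fitted beforehand on healthy data;
# a piecewise-linear interpolation over reference points stands in for it.
curve_speed = np.array([3, 5, 7, 9, 11, 13])
curve_power = np.array([80, 400, 950, 1750, 2650, 3200])
obs["expected_kw"] = np.interp(obs["wind_speed_ms"], curve_speed, curve_power)

# Residual-based anomaly flag: output far below expectation is suspicious.
obs["residual_kw"] = obs["power_kw"] - obs["expected_kw"]
obs["anomaly"] = obs["residual_kw"] < -400   # assumed threshold
print(obs)
```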
Are you interested in a walk-through scenario of this type of use case? Then take a look at our solution template on this use case! You can find it under Wind Turbine.
Elke Van Santvliet - Tangent Works
This use case looks at heat consumption, more specifically through water heating. Typical domestic uses of hot water include cooking, bathing and space heating. This heat transfer process is associated with significant costs, so ensuring energy efficiency is important. This illustrates the need for continuous monitoring of heating system health by closely watching whether the measured heat consumption is appropriate under the given circumstances. Anomalous values might indicate underlying issues, such as a ruptured pipe, loss of system pressure, water being stolen, or issues with a radiator or boiler. Accurate detection of these issues allows for well-aimed, timely inspections.
TIM’s ability to generate explainable models proves its value in this use case, as it enables users to understand which factors influence the target variable, heat consumption. Understanding what should be happening is a first step towards figuring out why this might sometimes not be the case. Accurate models can help to detect anomalies early on, which in turn can be crucial in preventing damage and costs. For example, the ability to detect and fix a leaking pipe in time might help prevent a ruptured pipe. Although some anomalies might be fairly obvious to the trained eye (e.g. a sudden drop in (part of the) consumption might indicate a broken meter), others might be more subtle (e.g. someone stealing part of the supply by draining some of the water from a pipe). TIM manages to detect both observations that are anomalous in relation to historical values and observations that are anomalous in relation to current circumstances (predictors).
Creating a model that can detect anomalous heat consumption requires a set of training data. This training data typically consists of past values of the heat consumption, as well as other available variables that play a role in heat consumption. Such variables can be found in meteorological data (outside temperature, wind speed, wind direction…) as well as metered system data (incoming and outgoing water flow).
TIM then uses these data to determine each observation’s anomaly indicator, which expresses how anomalous that observation is. This anomaly indicator in turn determines whether the threshold is crossed and the observation should be considered anomalous.
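For intuition only, a generic residual-based anomaly indicator (not TIM’s internal definition) could be a standardised deviation of the measured consumption from a model’s expectation, flagged when it exceeds a chosen threshold:

```python
import pandas as pd

# Hypothetical hourly series: measured heat consumption vs. a model's expectation
# given the circumstances (outside temperature, flows, ...).
df = pd.DataFrame({
    "measured_kwh": [10.2, 11.0, 9.8, 10.5, 16.9, 10.1],
    "expected_kwh": [10.0, 10.8, 10.1, 10.4, 10.6, 10.2],
})

residual = df["measured_kwh"] - df["expected_kwh"]

# Standardise the residual to get a simple anomaly indicator.
df["anomaly_indicator"] = (residual - residual.mean()) / residual.std()

# Flag observations whose indicator crosses an assumed threshold of 2.
threshold = 2.0
df["is_anomalous"] = df["anomaly_indicator"].abs() > threshold
print(df)
```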
Are you interested in a walk-through scenario of this type of use case? Then take a look at our solution template on this use case! You can find it under Heat Consumption.
Elke Van Santvliet - Tangent Works
Industry, companies, cities, households… all consume energy. Whether opting for electricity, gas or thermal power – or, more likely, a combination of them – the need for energy is all around us. Both consumers and producers can benefit greatly from accurate estimates of future consumption, not least because the extreme volatility of wholesale prices forces market parties to hedge against volume risk and price risk. Acting on incorrect volume estimates is often expensive, but accurate estimates tend to require the work of data scientists. This leads to the next challenge, since data scientists are hard to find and hard to keep. The ability to accurately forecast future energy consumption is a determining factor in the financial performance of market players. Therefore, the forecasts are also a key input to the decision-making process.
The value of Machine Learning in this use case is clear, but has to be weighed against the costs and efforts it introduces. To achieve accurate forecasts, relevant predictors should be used. TIM automates the generation of accurate forecasting models and tells you which input variables have the highest relevance in calculating the forecasts. Unlike data scientists, TIM creates these models in seconds rather than days, or even weeks. The scalability of TIM’s model generation process allows hundreds of models to be generated at the same time. This allows valuable data scientists to focus on the areas where their expertise matters most.
Let’s put this in numbers. A rough estimate of the savings from a 1% reduction in the MAPE (Mean Absolute Percentage Error) of the load forecast, for 1 gigawatt of peak load, shows that a market player can save about:
And these numbers don’t even take into account the savings on data scientist capacity.
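One hedged, back-of-the-envelope way to arrive at figures of this kind – the load factor, imbalance cost and other inputs below are purely illustrative assumptions, not the source of the numbers above – is the following:

```python
# Rough, hypothetical estimate of annual savings from a 1-percentage-point
# MAPE reduction on a 1 GW peak-load portfolio. All inputs are assumptions.

peak_load_mw = 1000           # 1 GW of peak load
load_factor = 0.6             # average load as a share of peak (assumed)
hours_per_year = 8760
mape_improvement = 0.01       # 1 percentage point lower forecast error
imbalance_cost_eur_mwh = 20   # assumed average cost of settling forecast errors

avg_load_mw = peak_load_mw * load_factor
annual_energy_mwh = avg_load_mw * hours_per_year

# Energy no longer mis-forecast, and the cost avoided by not settling it
# at imbalance prices.
avoided_error_mwh = annual_energy_mwh * mape_improvement
annual_savings_eur = avoided_error_mwh * imbalance_cost_eur_mwh

print(f"~{annual_savings_eur:,.0f} EUR per year under these assumptions")
# -> roughly 1,051,200 EUR with the values above
```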
Explanatory variables in energy consumption use cases include historical load data, at different levels of aggregation, as well as real-time measurements. These variables are supplemented by weather data, calendar information, day/night differences, production data, and so on.
TIM’s output in turn consists of the desired consumption forecast, in the same level of aggregation as the input target data, on short-term, medium-term and long-term horizons.
Are you interested in a walk-through scenario of this type of use case? Then take a look at our solution template on this use case! You can find it under Electricity Load.
Elke Van Santvliet - Tangent Works
Although ecological and quite popular, wind production is a volatile source of energy. Besides great opportunities for balancing the grid and forecasting production, this use case also involves a lot of predictive maintenance. Wind production use cases rarely centre around a single windmill or even a single wind farm, instead often involving a large portfolio of wind assets. The larger the portfolio, the more difficult it is to manage and to obtain optimal dispatch and exposure to the electricity market.
It is worth mentioning that mixed portfolios of solar and wind assets are common; don’t hesitate to take a look at this solar production use case.
TIM can contribute to this use case by automating and managing complex wind & solar modelling pipelines. Moreover, TIM allows for blended forecasts that unify high-quality intraday modelling and day(s) ahead modelling into a single API call. These forecasts are fully explainable and can take into account many additional variables, such as weather data, on top of historical values of the wind production. TIM accomplishes this in a scalable and accurate way, taking care to incorporate either current or expected data availability into the models it builds.
In this use case, explanatory variables can include weather related data, wind speed in particular, complemented by more technical information such as the wind turbine type(s). TIM’s output consists of the forecasted wind production in the same unit of measurement (typically kWh or MWh) and granularity as the input data, over the desired forecasting horizon.
Are you interested in a walk-through scenario of this type of use case? Then take a look at our solution templates on this use case! You can find them under Single Asset Wind Production and Portfolio Wind Production.
Elke Van Santvliet - Tangent Works
Many different parties are impacted by the production of photovoltaic plants, from owners to electricity traders to system regulators. This production has an impact on multiple domains, such as maintenance, trading and regulation strategies. However, the high short-term volatility in solar production makes balancing the grid a difficult task. Moreover, a single impacted party often has interests in a large portfolio of solar assets, which might consist of different sizes of plants at different locations. Inaccurate forecasts can result in significant financial penalties, whereas improvement of forecasting accuracy can lead to significant financial gains. Large portfolios with significant impacts require consistent and scalable forecasts.
Many parties are interested in mixed portfolios of solar and wind assets; if interested, take a look at this wind production use case.
TIM empowers users to intuitively execute and even automate this forecasting task by managing complex modelling pipelines and allowing for blended forecasts that unify high-quality intraday modelling and day(s) ahead modelling into a single API call. In addition, TIM works with fully explainable models, so users can easily understand which decisions are made and why.
Achieving high accuracy isn’t the only challenge in these situations, though. These large portfolios of volatile assets might not always have the same expected data availability. TIM can handle different data availability situations, either by allowing the user to account for the situation in the relevant Model Building Definition or by building and deploying models ad hoc, taking into account the current data availability situation.
Several different variables can be explanatory in this use case and should therefore ideally be included as inputs into the model building scenarios. These variables include weather related data such as the global horizontal irradiation (GHI) and the global tilted irradiation (GTI). Other factors cover the position of the sun, the GPS location of the PV plant(s) and the direct normal irradiance (DNI). Extensive domain knowledge can help identify possible explanatory variables that can be added to the input dataset.
The output values, i.e. the forecast, will contain the solar production in the same unit and intervals as the input data on the target variable, over the requested forecasting horizon. If desired, these output values can even be used as input for further risk analysis and optimisation.
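As a minimal sketch of how portfolio-level input data could be organised – plant names, columns and values are hypothetical – one target series per plant can be kept alongside its irradiation predictors, with one model-building run per plant:

```python
import pandas as pd

# Hypothetical hourly data for a two-plant solar portfolio.
index = pd.date_range("2024-06-01 06:00", periods=6, freq="h")

portfolio = {
    "plant_A": pd.DataFrame({
        "production_mwh": [0.1, 0.8, 1.9, 2.6, 3.0, 3.1],
        "ghi_wm2":        [50, 180, 370, 520, 610, 640],   # global horizontal irradiation
        "gti_wm2":        [60, 210, 420, 580, 670, 700],   # global tilted irradiation
    }, index=index),
    "plant_B": pd.DataFrame({
        "production_mwh": [0.0, 0.3, 0.9, 1.4, 1.7, 1.8],
        "ghi_wm2":        [40, 150, 330, 470, 560, 590],
        "gti_wm2":        [45, 170, 380, 530, 620, 650],
    }, index=index),
}

# One model-building run per plant keeps forecasts consistent yet plant-specific;
# here we simply show the per-plant datasets such a loop would iterate over.
for plant, data in portfolio.items():
    print(plant, data.shape)
```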
Are you interested in a walk-through scenario of this type of use case? Then take a look at our solution templates on this use case! You can find them under Single Asset Solar Production and Portfolio Solar Production.