Category: Model Optimisation Techniques

Sparse Coding

Sparse coding is a technique used to represent data, such as images or sounds, using a small number of active components from a larger set. Instead of using every possible feature to describe something, sparse coding only uses the most important ones, making the representation more efficient. This approach helps computers process information faster and…
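As a minimal sketch of the idea (the dictionary and signal below are invented for illustration), a greedy matching-pursuit loop represents a signal using only the few dictionary atoms that explain it best:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical overcomplete dictionary: 8 atoms for 4-dimensional signals.
D = rng.normal(size=(4, 8))
D /= np.linalg.norm(D, axis=0)  # unit-norm atoms

def matching_pursuit(x, D, n_active=2):
    """Greedily pick the few atoms that best explain x, yielding a sparse code."""
    residual = x.astype(float).copy()
    code = np.zeros(D.shape[1])
    for _ in range(n_active):
        scores = D.T @ residual            # correlation with each atom
        k = np.argmax(np.abs(scores))      # most important remaining atom
        code[k] += scores[k]
        residual -= scores[k] * D[:, k]    # remove its contribution
    return code

x = 2.0 * D[:, 3] + 0.5 * D[:, 5]          # signal built from two atoms
code = matching_pursuit(x, D, n_active=2)  # at most 2 nonzero entries
```

The resulting code has at most two active components out of eight, which is the efficiency sparse coding is after.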

Normalising Flows

Normalising flows are mathematical methods used to transform simple probability distributions into more complex ones. They do this by applying a series of reversible steps, making it possible to model complicated data patterns while still being able to calculate probabilities exactly. This approach is especially useful in machine learning for tasks that require both flexible…
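A one-dimensional affine flow is enough to illustrate the reversible-transform idea; the parameters below are arbitrary:

```python
import numpy as np

# Forward map: x = exp(s) * z + t, where z ~ N(0, 1). Because the map is
# invertible, the density of x follows exactly from change of variables.
s, t = 0.5, 1.0

def forward(z):
    return np.exp(s) * z + t

def inverse(x):
    return (x - t) / np.exp(s)

def log_prob(x):
    """Exact log-density of x: base log-density of z plus log|dz/dx| = -s."""
    z = inverse(x)
    base = -0.5 * (z**2 + np.log(2 * np.pi))  # standard normal log-density
    return base - s

x = forward(0.3)
```

Stacking many such reversible steps (with learned, input-dependent `s` and `t`) is what lets real flows model complicated distributions while keeping exact probabilities.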

Proximal Policy Optimization (PPO)

Proximal Policy Optimization (PPO) is a type of algorithm used in reinforcement learning to train agents to make good decisions. PPO improves how agents learn by making small, safe updates to their behaviour, which helps prevent them from making drastic changes that could reduce their performance. It is popular because it is relatively easy to…
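The "small, safe updates" come from PPO's clipped surrogate objective, sketched here with illustrative ratio and advantage values:

```python
import numpy as np

def ppo_clip_objective(ratio, advantage, eps=0.2):
    """PPO's clipped surrogate objective.
    ratio = pi_new(a|s) / pi_old(a|s); clipping it to [1-eps, 1+eps]
    removes the incentive to move the policy far from the old one."""
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1 - eps, 1 + eps) * advantage
    return np.minimum(unclipped, clipped)

# A large ratio with a positive advantage is capped at 1 + eps = 1.2,
# preventing one lucky batch from causing a drastic policy change.
capped = ppo_clip_objective(np.array([1.5]), np.array([1.0]))
```

In practice this objective is maximised with minibatch gradient ascent over data collected by the old policy.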

Temporal Difference Learning

Temporal Difference Learning is a method used in machine learning where an agent learns how to make decisions by gradually improving its predictions based on feedback from its environment. It combines ideas from dynamic programming and Monte Carlo methods, allowing learning from incomplete sequences of events. This approach helps the agent adjust its understanding over…
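A minimal TD(0) sketch on a made-up three-state chain shows the step-by-step updating, with the agent adjusting its value estimates after every transition rather than waiting for the episode to end:

```python
# Tiny deterministic chain: 0 -> 1 -> 2 (terminal), reward 1 on the final step.
values = {0: 0.0, 1: 0.0, 2: 0.0}
alpha, gamma = 0.1, 1.0  # learning rate and discount factor

for _ in range(500):                      # episodes
    for state in (0, 1):
        next_state = state + 1
        reward = 1.0 if next_state == 2 else 0.0
        # TD update: move V(s) toward the bootstrapped target r + gamma * V(s').
        target = reward + gamma * values[next_state]
        values[state] += alpha * (target - values[state])
```

Both non-terminal states converge toward a value of 1, the total reward reachable from them, without ever waiting for a complete Monte Carlo return.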

Ghost Parameter Retention

Ghost Parameter Retention refers to the practice of keeping certain parameters or settings in a system or software, even though they are no longer in active use. These parameters may have been used by previous versions or features, but are retained to maintain compatibility or prevent errors. This approach helps ensure that updates or changes…
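A hypothetical config loader illustrates the pattern (all names here are invented):

```python
# Defaults include a retired ("ghost") parameter that newer code ignores,
# retained so older tooling that still reads it does not break.
DEFAULTS = {
    "cache_size": 256,
    "use_legacy_renderer": False,   # no longer used, kept for compatibility
}

def load_config(user_config):
    """Merge user settings over defaults, keeping retired keys present."""
    config = dict(DEFAULTS)
    config.update(user_config)
    return config

config = load_config({"cache_size": 512})
```

Because the retired key always appears in the merged config, consumers written against the old schema keep working after an update.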

Memory-Constrained Inference

Memory-constrained inference refers to running artificial intelligence or machine learning models on devices with limited memory, such as smartphones, sensors or embedded systems. These devices cannot store or process large amounts of data at once, so models must be designed or adjusted to fit within their memory limitations. Techniques like model compression, quantisation and streaming…
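One of these techniques, uniform 8-bit quantisation, can be sketched in a few lines (the weight values are illustrative):

```python
import numpy as np

def quantise_int8(weights):
    """Uniform symmetric 8-bit quantisation: store int8 values plus one
    float scale, cutting a float32 tensor's memory roughly fourfold."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantise(q, scale):
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.03, 1.27], dtype=np.float32)
q, scale = quantise_int8(w)
w_hat = dequantise(q, scale)   # close to w, at a quarter of the storage
```

The trade-off is a small rounding error per weight in exchange for a model that fits in the device's memory budget.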

Cognitive Load Balancing

Cognitive load balancing is the process of managing and distributing mental effort to prevent overload and improve understanding. It involves organising information or tasks so that people can process them more easily and efficiently. Reducing cognitive load helps learners and workers focus on what matters most, making it easier to remember and use information.

Feature Engineering

Feature engineering is the process of transforming raw data into meaningful inputs that improve the performance of machine learning models. It involves selecting, modifying, or creating new variables, known as features, that help algorithms understand patterns in the data. Good feature engineering can make a significant difference in how well a model predicts outcomes or…
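A small illustrative sketch, with invented field names and features, of turning a raw record into model-ready inputs:

```python
import math
from datetime import datetime

def make_features(record):
    """Derive features a model can use from a raw transaction-like record."""
    ts = datetime.fromisoformat(record["timestamp"])
    return {
        "hour": ts.hour,                          # time-of-day behaviour
        "is_weekend": ts.weekday() >= 5,          # weekday vs weekend pattern
        "log_amount": math.log1p(record["amount"]),  # tame a skewed scale
    }

features = make_features({"timestamp": "2024-03-09T14:30:00", "amount": 99.0})
```

Each derived feature encodes a pattern the raw fields only hold implicitly, which is exactly the leverage good feature engineering provides.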