A dictionary for automation and digitalization
It can be difficult to navigate through all the concepts and abbreviations in automation and digitalization. That’s why we have made a simple overview.
Artificial Intelligence (AI) is the term for computer systems that are capable of performing tasks that previously required human intelligence. In automation, AI is often combined with Machine Learning, using algorithms to find patterns in large amounts of data more easily.
Application Programming Interface (API) is a programming interface used to exchange data between two different applications. This integration allows you to use simple, user-friendly applications to make changes, run processes or process data in a larger context, such as a database.
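As a minimal sketch of what such an exchange can look like: many APIs return data as JSON, which the calling application then decodes and acts on. The endpoint and field names below are invented for illustration.

```python
import json

# A hypothetical JSON response body from an API call such as
# GET /machines/42/status (endpoint and fields are illustrative).
response_body = '{"machine_id": 42, "state": "running", "temperature_c": 71.5}'

data = json.loads(response_body)   # decode the JSON payload into a dict
print(data["state"])               # the application can now use the data
print(data["temperature_c"] > 70)  # e.g. to trigger an alarm
```

In practice the response body would come over HTTP from the other application, but the decoding step is the same.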
Audit Trail refers to the registration of all incidents and procedures down to the smallest detail. With an Audit Trail you have the opportunity to go back and see changes and other events in chronological order. When working with data, this means that you can see the logical path that connects the sequence of events.
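A minimal sketch of the idea: an append-only log where every event gets a timestamp, so the sequence can be read back in chronological order. Event texts and user names are invented for illustration.

```python
from datetime import datetime, timezone

audit_trail = []  # append-only list; entries are never modified or deleted

def record(event, user):
    """Append an event with a UTC timestamp to the trail."""
    audit_trail.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "event": event,
    })

record("setpoint changed from 70 to 75", "operator1")
record("batch 1042 started", "operator2")

# Reading the trail back gives the events in chronological order:
for entry in audit_trail:
    print(entry["time"], entry["user"], entry["event"])
```

A real system would persist the trail to tamper-evident storage, but the append-only, timestamped structure is the core of the concept.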
Business Intelligence (BI) combines business analysis, data mining and visualization to make it easier for companies to make choices and changes based on data. In practice, this means that companies can use data to increase efficiency and quickly adapt to changes in the market.
Big Data are large, complex datasets that are difficult to analyze with standard data processing tools. When we talk about Big Data, we also often talk about new technology that makes it possible to analyze these large, complex datasets faster and more accurately.
Cloud Computing, or Cloud, refers to the provision of data services such as servers, databases, networks and software over the internet. The solution is often cost saving because you don’t need physical equipment. In addition, network backup provides good data security.
Data Mining combines statistics, AI and Machine Learning to uncover patterns and valuable information in large datasets. With Data Mining, you can either describe the target dataset or predict future outcomes using Machine Learning algorithms. In this way, raw data is transformed into useful information.
Distributed Control System (DCS) is a computerized control system used in complex control processes within limited areas, such as a single plant or factory. The control systems work “on the edge” and consist of several connected controllers.
A digital twin is a digital copy of a physical product or process. With the help of sensors, physical properties can be reproduced in digital form. The digital twin can provide information that can help optimize the product or process. Digital twins can also be used as prototypes to save costs.
Edge Computing, often referred to as Edge, is an open IT architecture that means that systems, equipment or software run close to the data source. Here, data is processed by the device itself, a local computer or server instead of being transferred to a data center. The purpose of running systems “on the edge” is improved response time.
Enterprise Resource Planning (ERP) is a software for management and administration of a company, such as production, warehousing and finance. ERP is not a single product, but an integration between the company’s systems. ERP systems contain real-time data, a common database and a unified user interface.
DataOps comes from the word Data Operations, and is a new approach to data analysis. Instead of one-to-one integrations, DataOps handles hundreds of machines with thousands of data points with one set of standard models. Data from different sources or systems can be compiled, contextualized and prepared “on the edge”.
A Data Scientist collects Big Data sets, and develops models for analyzing, processing and modeling the data. The results are then interpreted to create action plans for the company or the entire enterprise. Most businesses will need Data Scientists to be able to work with Big Data.
Human Machine Interface (HMI) is the interface between machines and humans. The term is mainly used for graphical user interfaces, in the form of software that allows the operator to control and manage industrial processes. Today, HMI is designed for the operator, with the intention of immediate overview and quick action.
Industrial Internet of Things (IIoT) is the industrial version of IoT, and is about how sensors, systems and equipment can be connected using industrial applications on computers. This connection allows for data collection, exchange and analysis that can be used to optimize the production.
Internet of Things (IoT) refers to various objects equipped with electronics, software, sensors, actuators and/or networks that allow them to connect to each other and exchange data. In the industrial context, we use the term IIoT (Industrial Internet of Things).
Industry 4.0 is an ongoing process that describes the fourth industrial revolution: Traditional production and industrial practices are automated with smart technology. Industry 4.0 involves connecting systems, devices and people, better data access and technology that helps people solve challenges.
Inputs/Outputs (I/O) is the term for communication between an information processing system (for example a computer or a PLC) and another system or human. Input is the data the system receives, and output is the data the system forwards.
Industrial PC (IPC) is a computer specifically made for industrial use. It involves technology that enables the IPC to run powerful applications that handle large amounts of data in harsh industrial environments, without sacrificing performance.
Information Technology (IT) involves the use of computers, networks and other physical devices to retrieve, create, process, secure, store and exchange all forms of electronic data. An IT system is a combination of hardware and software.
Client/server is an IT architecture characterized by software components communicating via a protocol in which one component requests services from the other. The solution is especially common when programs communicate through a computer network.
Key Performance Indicators (KPIs) denote a series of measurements used to chart the performance of the entire business or specific products and processes. Data collection and analysis software can provide you with insights in the form of KPIs.
Lean refers to a production method that involves identifying and minimizing waste and scrap, while optimizing productivity. The result is products of better quality, faster production time, reduced use of resources and lower production costs.
Laboratory Information Management Systems (LIMS) is a software used in laboratories to administer and track samples, instruments and other laboratory functions. Data can be extracted, analyzed and stored in electronic laboratory books.
Machine Learning (ML), a branch of AI, is a method of data analysis that automates the construction of analytical models. Intelligent systems built with Machine Learning algorithms can learn from historical data, identify patterns and make decisions with minimal human intervention.
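A minimal sketch of learning from historical data: fit a straight line y = a·x + b to observed pairs by ordinary least squares, then use the fitted model to predict a new value. The numbers are invented for illustration.

```python
# Historical observations (illustrative): e.g. machine load vs. power draw.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

# Ordinary least squares for y = a*x + b.
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

# "Make a decision" about an unseen input using the learned model.
predicted = a * 6 + b
print(round(predicted, 2))
```

Real ML systems use far richer models, but the pattern is the same: parameters are estimated from historical data, then applied to new data.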
Manufacturing Execution Systems (MES) is a collection of solutions that track and document the entire production process in real time. The solutions are driven by insights and intelligence from data integration, IIoT, Machine Learning and analysis, and enable “lean” production. The goal is comprehensive performance management for the entire business.
Modbus is a data communication protocol that is often used to connect SCADA systems with PLCs or RTUs. The communication is client/server based. Modbus has become a standard communication protocol for connecting industrial electronic devices.
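As a sketch of how simple the protocol is on the wire, the snippet below builds a Modbus TCP “Read Holding Registers” request (function code 0x03) by hand. The transaction id, unit id and register address are illustrative values.

```python
import struct

def read_holding_registers_request(transaction_id, unit_id, start_addr, quantity):
    """Build a Modbus TCP 'Read Holding Registers' (0x03) request frame."""
    function_code = 0x03
    # PDU: function code, starting register address, number of registers.
    pdu = struct.pack(">BHH", function_code, start_addr, quantity)
    # MBAP header: transaction id, protocol id (0 = Modbus), length, unit id.
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

frame = read_holding_registers_request(1, unit_id=17, start_addr=0, quantity=2)
print(frame.hex())  # -> 000100000006110300000002
```

In practice a Modbus library would build and send these frames, but the fixed, compact layout is why the protocol is so widely supported.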
Mean Time Between Failures (MTBF) is the expected time between faults in a system during normal operation, also referred to as average service life. MTBF can be used to compare different systems, but must be understood as an average value, not a guaranteed lifetime. The term is used about repairable systems.
Mean Time to Repair (MTTR) measures the average time spent on maintenance and repair of components or devices. The measurement can be taken from the time the fault occurred until the equipment is back in operation, or from the time the repair was started until the equipment is back in operation.
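A small worked example of both metrics, using invented operating figures: MTBF is operating time divided by the number of failures, MTTR is the average repair time, and together they give a steady-state availability estimate.

```python
# Illustrative operating history for one machine (all numbers are made up):
total_operating_hours = 4_500   # uptime over the period
number_of_failures = 3
repair_hours = [2.0, 5.5, 3.5]  # time to restore operation for each failure

mtbf = total_operating_hours / number_of_failures  # mean hours between failures
mttr = sum(repair_hours) / len(repair_hours)       # mean hours to repair
availability = mtbf / (mtbf + mttr)                # steady-state availability

print(mtbf, round(mttr, 2), round(availability, 4))
```

The availability formula MTBF / (MTBF + MTTR) is a common way to combine the two averages into a single uptime fraction.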
MQ Telemetry Transport (MQTT) is an open network protocol for communication between machines, as well as between machines and the cloud. The protocol is simple and usually runs over TCP/IP, with clients requiring minimal resources. MQTT’s scalability makes the protocol particularly suitable for connecting IoT and IIoT devices.
Overall Equipment Effectiveness (OEE) is a key figure that shows how well your machine, line or entire business is utilized. OEE is divided into three categories: Availability, Performance and Quality. These map different parts of the production and together give a complete picture of the production efficiency.
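The three categories multiply into a single score: OEE = Availability × Performance × Quality. A worked example with invented shift numbers:

```python
# Illustrative numbers for one production shift:
planned_time = 480        # minutes in the shift
downtime = 48             # minutes the line was stopped
ideal_cycle_time = 1.0    # minutes per unit at rated speed
units_produced = 400
good_units = 392

availability = (planned_time - downtime) / planned_time               # run time share
performance = (ideal_cycle_time * units_produced) / (planned_time - downtime)
quality = good_units / units_produced                                 # good-unit share
oee = availability * performance * quality

print(round(availability, 3), round(performance, 3), round(quality, 3), round(oee, 3))
```

Each factor is a fraction between 0 and 1, so the combined OEE immediately shows how far the line is from its theoretical maximum.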
Open Database Connectivity (ODBC) offers a standard API for accessing database management systems (DBMS), independent of operating systems. An application written with ODBC can be transferred to other platforms, both on the client and server side, with a few changes in code.
Open Platform Communication (OPC) is a secure communication platform for data exchange. The platform makes it possible to connect, monitor and control different devices through a single source. OPC is platform independent and ensures a seamless flow of information between systems from different vendors.
OPC Data Access, Alarm & Events, Historical Data Access, Data Exchange are the classic OPC standards, based on Microsoft technology. Each of the standards has different functions, and in some cases an OPC server may have several of the standards implemented.
OPC Unified Architecture (OPC UA) is the latest OPC standard, and the one we most often use today. OPC UA is platform independent and can run on macOS, Windows and Linux, and network communication is straightforward. OPC UA is often used to build a bridge between different DA servers, a technique called tunneling.
Operational Technology (OT) is hardware and software that monitors, manages and controls industrial equipment, processes and events. The term demonstrates the differences between IT systems and industrial control environments. Examples of OT are SCADA systems, PLCs and DCSs.
A Programmable Automation Controller (PAC) has functions similar to PC-based controllers and makes it possible to provide more complex instructions for automation equipment. Often the PAC comes as a smaller device, similar to a PLC. A PAC can monitor, manage and adjust production “on the edge”.
A Proportional-Integral-Derivative (PID) controller is used in industrial control applications to regulate electrical and mechanical appliances such as motors, pumps, heating and cooling elements, valves and fans. A digital PID controller reads sensor signals and notifies the user of any deviations.
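A minimal sketch of the PID idea: the controller output combines the current error (proportional), the accumulated error (integral) and the rate of change of the error (derivative). The gains, time step and toy “tank temperature” model below are illustrative, not tuned for any real plant.

```python
def pid_step(setpoint, measurement, state, kp, ki, kd, dt):
    """One update of a discrete PID controller.

    `state` carries the integral and previous error between calls.
    Gains kp, ki, kd are illustrative and would be tuned per application.
    """
    error = setpoint - measurement
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

# Drive a crude first-order "tank temperature" model toward a 50-degree setpoint.
state = {"integral": 0.0, "prev_error": 0.0}
temperature = 20.0
for _ in range(300):
    output = pid_step(50.0, temperature, state, kp=2.0, ki=0.5, kd=0.1, dt=0.1)
    temperature += (output - (temperature - 20.0)) * 0.1  # toy plant dynamics

print(round(temperature, 1))  # settles close to the setpoint
```

The integral term is what removes the steady-state offset a purely proportional controller would leave.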
Programmable Logic Controller (PLC) responds to switches and sensors, and can control machines, systems and processes. PLCs are often delivered as small, robust devices, designed to handle industrial environments. PLCs can communicate with other PLCs, external I/O and HMI.
Redundancy involves duplication of critical components in a system to increase operational reliability. If a component or function stops working, you will have available backup. I/O modules, controllers, sensors, actuators and industrial networks can be redundant.
Representational State Transfer (REST) is an architectural style that provides standards for communication between computer systems over the internet, making it easier for systems to communicate with each other. In applications that use REST, components can be managed and updated during operation without affecting the system.
Remote Terminal Unit (RTU) is a microprocessor-controlled electronic device that connects objects in the physical world to a distributed control system or SCADA system for monitoring and control. An RTU has a processor, memory and storage, and can be used as an intelligent controller or as a main controller for other devices.
Supervisory Control and Data Acquisition (SCADA) is a term for systems that collect data and monitor and control processes, also called automation or control systems. SCADA is often used to monitor large areas, such as entire factories or facilities. Most factories and plants have some form of SCADA.
Simple Network Management Protocol (SNMP) is a standard protocol used on IP networks to manage and monitor machines and network equipment. With one set of standards, SNMP collects data from hardware and software, organizes the data and helps detect and analyze network errors.
Simple Object Access Protocol (SOAP) is a protocol for exchanging structured information when implementing web services in computer networks. SOAP uses XML as the message format and relies on application layer protocols, such as HTTP, for transport.
Store & Forward describes technology where data or information is sent to an intermediate station where it is stored securely before being sent to the final destination. Store & Forward prevents data loss due to broken connections, downtime or other problems at the final destination.
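A minimal sketch of the pattern: messages are stored in a buffer first, and only removed once forwarding succeeds, so a broken connection delays delivery instead of losing data. The message names and the simulated connection are illustrative.

```python
from collections import deque

buffer = deque()  # intermediate store; a real system would persist this

def send_to_destination(message, destination_up):
    """Pretend transmission that fails while the destination is down."""
    if not destination_up:
        raise ConnectionError("destination unreachable")

def store_and_forward(message, destination_up):
    buffer.append(message)              # 1. store first, so nothing is lost
    while buffer:
        try:
            send_to_destination(buffer[0], destination_up)
        except ConnectionError:
            return                      # keep buffered messages for later
        buffer.popleft()                # 2. forward, then remove from the store

store_and_forward("reading-1", destination_up=False)  # buffered
store_and_forward("reading-2", destination_up=False)  # buffered
print(len(buffer))                                    # two messages waiting
store_and_forward("reading-3", destination_up=True)   # connection restored
print(len(buffer))                                    # buffer drained
```

The key ordering is store-then-forward: a message is only deleted from the intermediate store after the destination has accepted it.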
Statistical Process Control (SPC) is a methodology that uses statistical tools to determine whether a process (for example, a production process) is under control. In this way, problems can be foreseen and measures taken before errors result in deviations or scrap.
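A common SPC tool is the control chart: compute the process mean and standard deviation, set control limits at ±3 standard deviations, and flag measurements outside them. The fill-weight samples below are invented for illustration.

```python
import statistics

# Illustrative fill-weight measurements (grams) from a stable process:
samples = [50.1, 49.8, 50.0, 50.3, 49.9, 50.2, 50.0, 49.7, 50.1, 50.0]

mean = statistics.mean(samples)
sigma = statistics.stdev(samples)
ucl = mean + 3 * sigma  # upper control limit
lcl = mean - 3 * sigma  # lower control limit

# A new measurement is checked against the limits:
new_measurement = 50.9
in_control = lcl <= new_measurement <= ucl
print(round(lcl, 2), round(ucl, 2), in_control)
```

A point outside the control limits signals that something beyond normal variation is affecting the process, so it can be investigated before scrap is produced.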
Structured Query Language (SQL) is a query language used to communicate with databases. It is designed to manage data in a relational database management system and is particularly useful for managing structured data. Many of today’s database systems offer SQL as a control interface.
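A short example using Python’s built-in sqlite3 module, one of the many database systems that offer SQL as a control interface. The table and column names are illustrative.

```python
import sqlite3

# In-memory relational database; table and columns are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE batches (id INTEGER PRIMARY KEY, product TEXT, qty INTEGER)")
con.executemany("INSERT INTO batches (product, qty) VALUES (?, ?)",
                [("widget", 120), ("gadget", 80), ("widget", 60)])

# A query: total quantity per product, largest first.
rows = con.execute(
    "SELECT product, SUM(qty) FROM batches GROUP BY product ORDER BY SUM(qty) DESC"
).fetchall()
print(rows)  # -> [('widget', 180), ('gadget', 80)]
```

The same SELECT/INSERT/GROUP BY statements work, with minor dialect differences, across most relational database systems.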
Transmission Control Protocol/Internet Protocol (TCP/IP), also known as the Internet Protocol Suite, is a group of communication protocols used on the internet and in similar computer networks. The protocols provide end-to-end data communication that specifies how data is to be packaged, addressed, transmitted, routed and received.
Tunneling is a communication technique that makes it possible to send data from one network to another. This means that private network communications can be sent over a public network (such as the internet) through a process called encapsulation.
A thick client is a form of client-server architecture. Thick clients have an operating system and applications that can be used offline, and do not need continuous communication with a network or servers. Offline features include storage and retrieval of data and programs or applications.
A thin client is a form of client-server architecture. Thin clients run resources stored on a central server instead of a local hard drive. In a multi-client environment, managing, upgrading and maintaining thin clients can be significantly more efficient and cost-effective.
Extensible Markup Language (XML) is a code language similar to HTML, but without predefined tags. Instead, you define your own codes specifically designed for your needs. The XML format is standardized and can be shared across systems or platforms, locally and over the internet.
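A small example of self-defined XML tags, parsed with Python’s built-in xml.etree module. The element and attribute names are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Self-defined tags; XML has no predefined tag names.
document = """
<machine id="42">
    <state>running</state>
    <temperature unit="C">71.5</temperature>
</machine>
"""

root = ET.fromstring(document)
print(root.tag, root.attrib["id"])           # element name and attribute
print(root.find("state").text)               # text content of a child element
print(root.find("temperature").get("unit"))  # attribute on a child element
```

Because the format is standardized plain text, the same document can be produced and consumed by systems on any platform.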