By Mr. Mandya Venkatram


Mr. Venkatram is presently a freelance trainer, facilitator and consultant in Technical, Management and Business Excellence related areas, for corporates as well as Management Development Institutes.

Automated production is generally accepted as reliable, efficient, transparent and predictable, provided it is integrated end-to-end. This is the expectation of industrial automation, a core component of the Industrial Internet of Things (IIoT), and manufacturing automation has become a popular process. With AI (artificial intelligence) and remote communication advancing, obsolescence is likely to creep into current practice in the coming years. This is predominantly because a number of new technologies are creating opportunities for manufacturers worldwide.

It is quite natural that, when asked to identify and list such relatively nascent and impactful technologies, professionals will come up with a number of different items, and therefore dissimilar lists. These could include, for example, Edge Computing, Digital Twins, Augmented Reality, the breaking of Moore’s Law, automating engineering insights with Machine Learning, and so on. In such a context, it is interesting to see which technologies international market research agencies identify as having the highest impact, particularly in the geography we are located in. One such international market research company, Technavio, has released a report identifying the following as the three biggest trends expected to influence the industrial automation control market up to the year 2020:
1. Cloud-based supervisory control and data acquisition (SCADA) systems.
2. Increased use of analytics.
3. Growing use of programmable automation controllers (PACs).
The report further says that while the above is particularly valid for the Asia-Pacific region, these trends will also be applicable globally.
Let us now take a closer look at each of these.

1. SCADA Systems: Accelerated Adoption

A SCADA system is used for the remote monitoring and control of industrial processes such as the transmission and distribution of electric power, water and compressed natural gas, as well as refining, fabrication and the like. These systems use coded signals to communicate with remote stations over communication channels.

Conventionally, such SCADA systems are installed and used within the facility: an operator sits and monitors or controls applications using a human-machine interface such as a monitor and keyboard or a touch-screen monitor. The integration of cloud computing technology with SCADA systems enables operators to control applications via the Internet. We are therefore bound to see growth in the acceptance and adoption of such SCADA systems owing to their scalability, ease of updating and upgrading, and general accessibility through the cloud. What could dampen the pace of adoption, however, is the fear that cloud-based SCADA systems could be infiltrated by external agents (hackers), with consequent security issues negatively influencing plant and system control and data privacy. As cloud security systems are reinforced with superior strength, features and sophistication, this deterrent to adopting SCADA systems is expected to progressively diminish.


2. Identification of Errors and their Reduction through Analytical Software

Another emerging trend that the Technavio report predicts will significantly influence automation is strong growth in the adoption of data management and analytical software for error reduction and for facilitating decision making. Recent techniques such as predictive modeling, optimization, statistical analysis and forecasting, and their integration into software such as that used by SCADA and Advanced Process Control (APC) systems, have greatly facilitated the identification of errors and the prediction of possible errors.

Aligned with this development, automation system and equipment vendors have commenced incorporating relevant analytical tools into their SCADA systems and APCs. Take, for example, thermal, wind and solar-powered energy generation plants in which sophisticated SCADA or equivalent systems are used. When an alarm is generated in such plants, it can be difficult to determine whether it is genuine or false (the result of a malfunctioning sensor or a broken conductor bringing the signal from the field to the input modules). Ignoring a genuine alarm could be unsafe and even catastrophic, leading to severe tangible damages such as loss of life, loss of plant and consequent damage to neighboring installations, as well as intangible damages such as penalties, lawsuits, loss of reputation and loss of subsequent business opportunities.

On the other hand, assuming a “cry wolf” or spurious alarm to be genuine, and consequently allowing the plant to be partially or completely shut down as per the corresponding logic, could result in expensive shutdowns, lost business opportunities, penalties where committed production levels and deliveries are not met, and so on. Such avoidable shutdowns lead to significant reductions in productivity, efficiency and profits, endangering the sustainability of the business itself.

What then is the solution? Predictive analysis is the answer to such dilemmas. By analyzing related parameters and the computed status of relevant binary logic devices, the correct status of the relevant signal and its consequent alarm, i.e. “genuine” or “spurious”, can be determined. The correct consequent action, based on plant logic, can then be safely and confidently taken, raising the reliability and efficiency of such SCADA or equivalent systems. What about errors related to “analog value” signals, such as temperatures and pressures? For significant parameters, to enhance the availability and reliability of such analog measurements, two independent and identical sensors measuring the same parameter at the same plant location are often employed, with the signal processed in a MIN-select or MAX-select mode. If one of the sensors is detected to be erroneous, it is taken out of scan, ignored, and not taken into account. The signal from the remaining healthy sensor is then accepted as correct in a 1-out-of-1 configuration.
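
The two-sensor selection scheme described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's actual logic; the fault flags, readings and the choice of MIN select for a high-temperature application are assumptions for the example.

```python
def select_analog(value_a, value_b, fault_a, fault_b, mode="MIN"):
    """Redundant two-sensor analog selection with MIN/MAX select.

    A sensor flagged erroneous is taken out of scan; the remaining
    healthy sensor is accepted in a 1-out-of-1 configuration.
    """
    if fault_a and fault_b:
        return None  # both sensors bad: raise an alarm, hold last good value, etc.
    if fault_a:
        return value_b  # 1-out-of-1 on the healthy sensor
    if fault_b:
        return value_a
    return min(value_a, value_b) if mode == "MIN" else max(value_a, value_b)

# Both sensors healthy: MIN select is the conservative choice for a
# high-temperature protection signal
print(select_analog(540.2, 541.0, False, False))  # 540.2
# Sensor A taken out of scan: sensor B accepted alone
print(select_analog(540.2, 541.0, True, False))   # 541.0
```

Whether MIN or MAX select is appropriate depends on which direction of error is the safe one for the particular protection function.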

For critical, revenue-influencing inputs (such as main steam temperature in thermal power plants), three sensors are often employed in a 3-out-of-3 configuration, with only controlled inter-sensor deviation levels accepted, and either the MIN, MAX, MEAN or middle value taken as the accepted value. In a thermal power plant, for example, the main steam temperature is used to compute the heat rate, i.e. the efficiency of the generating unit, and consequently the “cost of generation”. Based on this input, the regulating authority determines the cost of generation and therefore the electric power tariff for that period. Hence the criticality of the actual main steam temperature measured and acquired.
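
The middle-value selection for three sensors can be sketched as below. The deviation limit of 5 °C and the heat-rate figure are illustrative assumptions, not plant-specific values; the efficiency relation itself simply uses the fact that 1 kWh equals 3600 kJ.

```python
def middle_value(t1, t2, t3, max_deviation=5.0):
    """Accept the middle (median) of three redundant measurements,
    provided the inter-sensor deviation stays within a controlled limit."""
    readings = sorted([t1, t2, t3])
    if readings[2] - readings[0] > max_deviation:
        raise ValueError("inter-sensor deviation limit exceeded")
    return readings[1]

print(middle_value(538.5, 540.0, 540.4))  # 540.0

# Unit efficiency from heat rate (kJ per kWh generated): 1 kWh = 3600 kJ
heat_rate = 9000.0  # illustrative value
print(f"efficiency = {3600.0 / heat_rate:.1%}")  # efficiency = 40.0%
```

The middle value is robust against a single sensor drifting high or low, which is why it is a common choice for tariff-sensitive measurements.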

If one of the three temperature transmitters is detected to be erroneous, it is kept out of scan and the system automatically reconfigures itself as a two-sensor system. In such a case, which measured value of the steam temperature should be accepted, the higher one or the lower one? Either choice has revenue consequences, favoring either the plant operator or the regulator, with a consequent influence on the operator’s returns. One solution in such situations is to use a mathematical model of the actual plant or system, the values of other related plant parameters, and predictive analytic techniques; the mathematically “most reasonable” value is then determined and accepted.
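
One simple way to pick the “most reasonable” value, sketched below under the assumption that a plant model already supplies an independent estimate from related parameters (pressure, load, fuel flow and so on): accept the surviving reading closest to that model estimate. The numbers are hypothetical.

```python
def most_reasonable(readings, model_estimate):
    """When redundancy degrades to two sensors, accept the reading
    closest to a value predicted by a mathematical model of the plant,
    rather than arbitrarily taking the higher or the lower one."""
    return min(readings, key=lambda r: abs(r - model_estimate))

# Hypothetical: the plant model predicts 539.6 C from related parameters;
# 540.9 is closer to the model estimate than 538.1, so it is accepted
print(most_reasonable([538.1, 540.9], 539.6))  # 540.9
```

In practice the model estimate would itself carry an uncertainty band, and a reading falling far outside that band would be rejected outright.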


3. PACs (Programmable Automation Controllers) And Their Accelerated Adoption

With the advent of digital computers and their industrial application, it is interesting to observe why computer-based Data Acquisition Systems were generally accepted readily, while Direct Digital Control, i.e. computer-based plant or process control, was not. To the plant technicians of the time, computers were highly complex, sophisticated devices that had to be housed in dust-proof, air-conditioned environments with temperature and relative humidity maintained within stringent limits. Further, programming was then an esoteric science beyond the comprehension of the lay workman. All these factors led to industrial computers being seen as highly delicate, sophisticated systems, prone to malfunction or failure if the environmental conditions were not met, and therefore “too risky” to be depended upon to independently run the plant. Hence the reluctance of workmen to accept direct digital control, with a digital computer directly controlling the plant or production process.

On the other hand, computer-based Data Acquisition Systems were accepted with less reluctance because they brought only one-way signals from the field to the control room for display and analysis; backup hardware-based indicators and recorders were available for critical parameters, and the operation of the plant was not “endangered” even if the computer-based Data Acquisition System became unavailable.

The PLC (Programmable Logic Controller) evolved and was readily accepted in industry owing to its use of ladder logic programming, which bore a one-to-one resemblance to the familiar electro-mechanical relay and relay-contact control schematics. If a command was not executed, it was simple, even for a technician unfamiliar with computers or programming, to look at the monitor and see which contact was in the erroneous state, holding up the flow and preventing execution of the command. Besides, PLCs were “ruggedized”: they could be kept on the shop floor and did not need highly controlled environments to function.
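
The one-to-one resemblance is easy to see with the classic start/stop seal-in rung from relay schematics. A minimal sketch of its behavior (the signal names are illustrative, and real ladder logic is drawn graphically, not written this way):

```python
def motor_rung(start_pb, stop_pb, motor):
    """Behavior of the classic start/stop seal-in rung:

        ----[ start_pb OR motor ]----[/ stop_pb /]----( motor )----

    The motor contact in parallel with the start pushbutton
    'seals in' the coil after the button is released.
    """
    return (start_pb or motor) and not stop_pb

motor = False
motor = motor_rung(start_pb=True, stop_pb=False, motor=motor)   # start pressed
print(motor)  # True
motor = motor_rung(start_pb=False, stop_pb=False, motor=motor)  # start released, sealed in
print(motor)  # True
motor = motor_rung(start_pb=False, stop_pb=True, motor=motor)   # stop pressed
print(motor)  # False
```

An electrician who has wired this circuit in relays can read the equivalent ladder rung at a glance, which is precisely why the notation was accepted so readily.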

The PLC grew in popularity and found wide acceptance across industries. The number of input and output signals it could handle kept increasing, and the functions it could perform kept growing. PLCs came to be manufactured in a range of input-output capacities, from the very small to the very large, and also became progressively less expensive. The number of organizations manufacturing PLCs grew and spread around the world, and PLCs found application everywhere from gigantic enterprises to humble home automation.

Ladder logic programming is simple in concept, robust, and intuitive owing to its parallel relationship with the systems it controls, and is relatively easy to learn and master; it is offered by almost all PLC manufacturers. However, this very ladder logic has become a millstone around the PLC’s neck. It takes time to program, and troubleshooting malfunctions is tedious in large logic schemes spanning multiple screens of ladder logic. Further, although various manufacturers offer ladder logic programming with their respective PLCs, the programmes are unique to each manufacturer and not interchangeable. This restricts a user to the suite of programmes from a single manufacturer; picking and choosing programmes from multiple manufacturers is not possible. In fact, it is often said that using ladder-logic-based PLCs for plant control and automation is like using a desktop computer only as a typewriter, because beyond logic, the types and nature of functions it can perform are very limited compared with programming in a high-level language such as C or C++. This is where PACs (Programmable Automation Controllers) come in.

PACs combine the PLC (Programmable Logic Controller) element with the elements of a PC-based controller. Using high-level languages such as C and C++ with PACs, and with some PLCs, a manufacturer can allow logic controllers from different providers to communicate; this flexibility permits choosing the best products from different manufacturers when one brand does not meet all application requirements. Further, PACs can use standard communication protocols, which facilitates the downloading and transfer of information between the various connected systems. The technology has proven reliable in terms of scalability of production and the range and flexibility of products manufactured.
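
A taste of what a high-level language adds beyond boolean rungs: a discrete PID control loop, something a PAC can run alongside its logic. This is a textbook sketch; the gains, sample time and setpoint values are illustrative, not tuned for any real plant.

```python
class PID:
    """Discrete PID loop of the kind a PAC executes each scan cycle.
    Gains (kp, ki, kd) and sample time (dt, seconds) are illustrative."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """Compute the controller output for one scan."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
# First scan: error = 5.0, so output = 2.0*5 + 0.5*5 + 0.1*5
print(pid.update(setpoint=540.0, measurement=535.0))  # 13.0
```

Expressing such a loop in pure ladder logic is possible but awkward, which is exactly the "typewriter" limitation noted above.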

The range and versatility of applications for which PACs can be used are practically unlimited. Some programme modules even have “steam tables” built in; when such a module is used for thermodynamic computations in a thermal power plant, one realizes how far the PAC has taken the PLC. There are equally good examples of PAC use in other industries and disciplines, so the use of PACs is bound to grow universally, by leaps and bounds.
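
As a rough illustration of what a built-in steam table does, the sketch below interpolates saturation pressure from temperature over a tiny table. The table holds rounded textbook values for saturated steam; a real PAC module would use the full IAPWS-IF97 correlations rather than linear interpolation over a handful of points.

```python
# (temperature in deg C, saturation pressure in kPa), rounded textbook values
SAT_TABLE = [(100, 101.3), (120, 198.5), (150, 476.1), (180, 1002.6), (200, 1554.9)]

def sat_pressure(temp_c):
    """Saturation pressure by linear interpolation in a small steam table."""
    if not SAT_TABLE[0][0] <= temp_c <= SAT_TABLE[-1][0]:
        raise ValueError("temperature outside table range")
    for (t1, p1), (t2, p2) in zip(SAT_TABLE, SAT_TABLE[1:]):
        if t1 <= temp_c <= t2:
            return p1 + (p2 - p1) * (temp_c - t1) / (t2 - t1)

print(round(sat_pressure(150), 1))  # 476.1 (a table node)
print(round(sat_pressure(110), 1))  # midway between the 100 and 120 entries
```

With such property functions available in the controller itself, heat-rate and efficiency computations can run online rather than offline in a spreadsheet.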

We can look forward to exciting times ahead, with a number of new technology trends on the horizon. Whichever gains traction first, it is both our responsibility and our opportunity to prepare ourselves proactively and adequately for it, to derive maximum advantage from it, and with it to make ours a better world for all.