What is Data Analytics as a Service?

Introduction

Data Analytics is very diverse in the solutions it offers. It covers a range of activities that add value to businesses and has secured a foothold in virtually every industry, eventually carving out a niche for itself known as Data Analytics as a Service (DAaaS).

DAaaS is an operating model in which a service provider offers data analytics services that add value to a client's business. Companies can use DAaaS platforms to analyze patterns in their data through ready-to-use interfaces. Alternatively, they can outsource the entire data analytics task to the DAaaS provider.

How does DAaaS Help Organizations?

Have you ever wondered how CEOs make big decisions, the potential game-changers that make large companies trade high on the NYSE, NASDAQ, and other exchanges? A surprising number of organizations still rely on intuition-based decision-making: high-stakes business decisions are made solely on gut feeling and speculative explanations. Such decisions carry an element of uncertainty unless that uncertainty is properly assessed. Data Analytics offers ways to use data to mitigate the associated risks and enable well-grounded decision-making.

Organizations constantly collect data on competitors, customers, and other factors that contribute to a business's competitive advantage. This data supports strategic planning and decision-making. The million-dollar question is whether an organization should build in-house data analytics capabilities or outsource to Data Scientists with deep technical expertise. The answer lies in the digital maturity of the organization. Most organizations prefer to focus on their core business rather than donning multiple hats at the same time, so more and more of them are outsourcing their Data Science work to make the most of their data. DAaaS distills the most relevant information from that data to help organizations make the best possible data-driven decisions.

Why Organizations Should Outsource Data Analytics

For many reasons, organizations, particularly start-ups, are turning to outsourced Data Analytics. Outsourcing has long been undertaken as a cost-cutting measure and is an integral part of advanced economies. Some of the main reasons why companies should opt for outsourcing Data Analytics include: 

  • Organizations can focus on core business.
  • Outsourcing offers flexibility, as the service can be engaged only when it is required.
  • Organizations don't have to maintain a large infrastructure for data management.
  • Organizations can benefit from high-end analytics services.
  • Outsourcing has lower operational costs.
  • It improves risk management.

What Can DAaaS Do for You?

Data Import

Data import is the first step towards building actionable insights. It helps organizations import data from their systems into the DAaaS platform. Data is an asset for organizations as it influences their strategic decision-making. Managing data is vitally important to ensure data is accurate, readily available, and usable by the organization. 

Translate Data into Actionable Insights

Data is useful only when it is acted upon to derive insights that add value. Connecting the dots between data points is essential for putting the facts and figures together; data means little if those dots can't be connected. The outcome of this exercise helps answer one of the following bottom-line questions:

  1. What happened? Descriptive Analysis
  2. Why did it happen? Diagnostic Analysis
  3. What is likely to happen? Predictive Analysis
  4. What should be done? Prescriptive Analysis

Testing of ‘Trained Models’

Testing a model's accuracy is an essential first step before the model is put to use. To test it, the data is divided into three subsets: training data, test data, and validation data. The model is built on the training dataset, which comprises the largest proportion of the data. The trained model is then evaluated against the test data to estimate how well it will predict future outcomes. Validation data is used to check the accuracy and efficiency of the model; the validation dataset is typically data that was not used in developing the model.
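As a minimal sketch of this split in Python (the file name, target column, and 70/15/15 proportions below are illustrative assumptions, not a prescription):

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical dataset: "sales.csv" with a numeric target column "revenue".
df = pd.read_csv("sales.csv")
X, y = df.drop(columns=["revenue"]), df["revenue"]

# Keep 70% for training; split the remaining 30% evenly into test and validation sets.
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.30, random_state=42)
X_test, X_val, y_test, y_val = train_test_split(X_rest, y_rest, test_size=0.50, random_state=42)

print(len(X_train), len(X_test), len(X_val))
```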

Prediction and forecasting using ‘Trained Models’

Predicting future events using analytical models has come to be known as predictive analytics. The models are fit (trained) on historical data, and as new data keeps arriving they are refit, gradually improving the accuracy of their predictions. Predictive analytics increasingly draws on techniques from Machine Learning and Artificial Intelligence to improve the reliability and automation of these predictions.
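To make the idea concrete, here is a minimal sketch of fitting a model on historical data and predicting a future value; the demand figures, the monthly granularity, and the choice of plain linear regression are assumptions made purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical history: monthly demand for the past 12 months.
months = np.arange(1, 13).reshape(-1, 1)
demand = np.array([110, 115, 123, 130, 128, 140, 146, 151, 158, 162, 170, 175])

model = LinearRegression().fit(months, demand)   # "train" the model on history
forecast = model.predict(np.array([[13]]))       # predict demand for month 13
print(round(float(forecast[0]), 1))

# When month 13's actual demand arrives, it is appended to the history and the
# model is refit, which is how predictions improve as data accumulates.
```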

Deploy Proven Analytical ‘Models’

Training a model is often easier than deploying it. Deployment is the process of putting the trained model to work for the purpose it was built for, and it determines how end-users interact with the model's predictions, whether through web services, a mobile application, or desktop software. This is the phase in which predictive modeling actually delivers value to the business.
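One common way to expose a trained model to end-users is a small web service. The sketch below uses Flask and assumes a previously trained model saved as model.pkl; the framework choice, file name, and JSON format are illustrative, not a fixed recipe:

```python
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical artifact: a scikit-learn model trained and pickled earlier.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body such as {"features": [1.2, 3.4, 5.6]}.
    features = request.get_json()["features"]
    prediction = model.predict([features])[0]
    return jsonify({"prediction": float(prediction)})

if __name__ == "__main__":
    app.run(port=8000)
```

A mobile app or another system would then simply POST feature values to /predict and read back the returned prediction.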

Conclusion

Data Analytics as a Service (DAaaS) companies enable access to high-tech resources without the need to own them. Organizations can reach out to DAaaS providers only when their services are required, cutting the considerable cost of maintaining Data Analytics infrastructure and hard-to-find Data Scientists. This model has helped usher in a new world of the Gig Economy.

Let’s Excel Analytics Solutions LLP is a DAaaS company that offers a solution to all your Data Analytics problems. 

Curious to know more?

Internet of Things: A Few Insightful Facts

Introduction

The internet has revolutionized modern society. It has simplified nearly everything we do and put the world's conveniences at our fingertips. Lately, there has been another wave of transformation: through convergence with a diverse set of technologies, the traditional internet has evolved into the Internet of Things (IoT). This evolution has broadened its applications beyond general consumer usage and driven dramatic changes on industrial platforms. This blog explains the basic idea behind IoT and its applicability in diverse fields.

What is the Internet of Things?

IoT is defined as the network of objects (IoT devices) embedded with computing devices that enable them to exchange data over the internet. These objects range from general consumer items to industrial applications. The IoT for industrial applications is also known as the Industrial Internet of Things (IIoT).  

How does the IoT work?

An IoT device comprises three main components: a sensor, a microprocessor, and communication hardware. The sensor constantly collects data from the environment, the microprocessor analyzes the collected data (often using machine learning algorithms), and the communication hardware exchanges data with other IoT devices. Most IoT devices can be controlled remotely through an app or other software.
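The sense-analyze-communicate loop can be sketched in a few lines of Python. Everything here is an assumption for illustration: the gateway URL is hypothetical, the "sensor" is a random number, and the on-device "analysis" is a trivial threshold check:

```python
import random
import time

import requests

GATEWAY_URL = "https://iot-gateway.example.com/readings"  # hypothetical cloud endpoint

while True:
    reading = 20 + random.random() * 10     # sensor: stand-in for a real temperature probe
    too_hot = reading > 26.0                # microprocessor: a trivial on-device analysis
    payload = {"device": "living-room-01", "celsius": round(reading, 1), "too_hot": too_hot}
    try:
        requests.post(GATEWAY_URL, json=payload, timeout=5)  # communication: send to the gateway/app
    except requests.RequestException:
        pass  # a real device would buffer the reading and retry
    time.sleep(60)
```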

Applications of IoT

  • Home improvement devices

IoT has realized the concept of the smart home. Most home appliances can now be programmed remotely using IoT features, which has significantly enhanced quality of life. These include air conditioning and lighting systems, alarm and home security systems, refrigerators, robotic vacuum cleaners, TVs, and more, all of which can be controlled remotely from an app installed on a smartphone.

  • IoT in industrial manufacturing

The implementation of IoT has ushered the manufacturing industry into a new era of smart factories. It has numerous applications right from supply chain management through core manufacturing operations to distribution of the finished product. IoT-enabled manufacturing employs advanced sensors that collect data across all the critical operations of the production flow. This data is fed to cloud-based analytics to generate valuable insights that eliminate waste and unnecessary rework and encourage continuous process improvement. It also alerts operators to potential breakdowns and triggers preventive maintenance to avert downtime.

  • IoT in healthcare

Many wearable devices are available that monitor vital signs such as blood pressure and heart rate and track calories burned. Athletes use these devices to track the intensity of their workout sessions, and the same bands can track sleep patterns. Some devices include automatic fall detection systems that can predict the likelihood of fainting, particularly in elderly people. In a potential fall situation, these devices can send SOS signals to family members or ambulance services.

Physicians have also been using smart IoT devices to track the health status of patients. These devices can alert physicians when immediate medical attention is needed. In addition, physicians can track patients' adherence to treatment regimens and monitor how well the treatment is progressing.

  • Smart cities

Smart cities employ advanced technologies to build highly efficient and sustainable infrastructure. For example, smart lighting can drastically reduce energy consumption by switching on only when someone walks past and off again afterwards. Air quality tools continuously monitor pollution data in real time and forecast emissions. Sensors installed on streets can give real-time updates for traffic management.

  • IoT in the automotive industry

Autonomous cars are now fitted with IoT sensors to reduce human error and judgment lapses during driving. This helps avoid accidents and makes driving safer and more comfortable.

Advantages of IoT

  1. It automates processes and improves quality of life.
  2. It enables access to information from anywhere in the world at any time.
  3. It enables communication between devices without any human intervention.
  4. It saves capital, resources, and time.
  5. It enhances efficiency and productivity.

Disadvantages of IoT

  • Because IoT devices are connected over a network, they are exposed to security attacks.
  • IoT devices continually share substantial amounts of data, which puts users' personal information at risk.
  • IoT systems are very complex and therefore vulnerable to failures.

Future of IoT

According to IoT Analytics, there were over 4.7 billion devices connected to the IoT in 2016. That figure was expected to reach 11.6 billion by the end of 2021 and is anticipated to grow to around 21 billion by 2025. The total IoT market was worth $389 billion in 2020 and is forecast to rise to $1 trillion by 2030.

Conclusion

The Internet of Things has transformed and simplified much of what we do, from household activities to commercial manufacturing operations, by automating processes without human intervention. Owing to its vast applicability, almost every device we use is turning smart today.

Curious to know more?

Examples of Internet of Things

Introduction

Applications of the Internet of Things (IoT) are spread widely across all walks of life. IoT creates an ecosystem in which everything is connected and controlled through an application, and this connectedness and remote control simplify and improve our lives. The basic idea behind the Internet of Things was discussed in our previous blog. In this blog, we look at how IoT is applied in diverse fields, focusing on examples from retail, food, healthcare, manufacturing, and general consumer use.

Various examples of the Internet of Things are discussed below:

Retail analytics

  • The Internet of Things (IoT) has made warehouse automation possible. Warehouse automation refers to moving inventory in and out of the warehouse without human intervention. IoT enables automatic replenishment by raising purchase orders and tracking them, a feature known as demand-aware warehousing. Autonomous mobile robots have taken over the physical movement of inventory from its storage location to the shipping area, and this movement is automatically captured in the ERP system.
  • Imagine a long queue of customers waiting at the checkout while the billing machine is down. This drives customers away and eventually hurts the business. Similarly, the breakdown of a deep freezer storing temperature-sensitive goods could mean a huge loss. To reduce such untimely downtime, IoT can be leveraged to signal preventive maintenance needs before failures occur.
  • IoT allows smart transportation of goods through GPS tracking and routing of trucks. It also addresses the challenges of transporting temperature-sensitive goods: temperature can be tracked in real time and potential risks mitigated before they materialize.
  • IoT-based video monitoring of customer traffic gives insight into potential buyers. If a customer is dwelling on a product, a store associate can be sent over to increase the likelihood of a sale. The same footage can be used to train store associates and to monitor potential problems, such as shoplifting, so that timely and appropriate action can be taken.

Cold Chain in the Food Industry 

IoT has solved the basic challenges experienced by the food and beverage industry: 

  1. Food safety and traceability and
  2. Food wastage and cost.
  • IoT-based sensors in the cold chain monitor temperature, humidity, and light, offering unparalleled safety and traceability right from the farm to the retail grocery store. All of this data is available in the cloud and can be accessed anywhere over the internet, making it an indispensable tool for an effective food safety strategy (a minimal monitoring sketch follows this list).
  • Most food waste in the industry is attributed to gaps in supply chain management. IoT has been shown to reduce food waste significantly by filling those gaps: automated data collection provides real-time insight for effective supply chain management, which can drastically cut the costs incurred through wastage.
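As a rough illustration of cold-chain monitoring, the sketch below checks a stream of readings against an allowed temperature band and raises an alert on excursions; the band, the readings, and the alert mechanism are all assumptions made for the example:

```python
from datetime import datetime

LOW_C, HIGH_C = 2.0, 8.0  # assumed allowed band for a refrigerated shipment

def check_reading(sensor_id: str, celsius: float) -> None:
    """Flag any temperature excursion outside the allowed band."""
    timestamp = datetime.now().isoformat(timespec="seconds")
    if celsius < LOW_C or celsius > HIGH_C:
        # A real system might push this to a dashboard, SMS, or ERP workflow.
        print(f"{timestamp} ALERT {sensor_id}: {celsius:.1f} C out of range")
    else:
        print(f"{timestamp} OK    {sensor_id}: {celsius:.1f} C")

# Simulated readings from two refrigerated trucks in the fleet.
for sensor, temp in [("truck-17", 5.4), ("truck-17", 9.1), ("truck-22", 3.8)]:
    check_reading(sensor, temp)
```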

IoT Assisted Patient Care

  • Many IoT devices on the market enable online monitoring of patients' vital signs. This allows real-time tracking of a patient's health status and alerts the patient to potential emergencies. In a real emergency, these devices can send an SOS to hospitals and ambulance services.
  • IoT has very important applications in elderly care. IoT-based automatic fall detection devices can predict a potential faint or emergency and trigger immediate action accordingly.
  • Physicians also use IoT devices to monitor patients away from the hospital. For example, technology that monitors an implanted pacemaker can provide meaningful insight into how it is working and warn of potential failures.

General Consumer IoT

  • Most of the appliances we use today have IoT features, including air conditioners, refrigerators, vacuum cleaners, and televisions. These devices are connected through a gateway and controlled remotely by a mobile app. For example, an air conditioner can be switched on remotely to cool the house before you actually reach home.
  • IoT can also be used to utilize energy at home more efficiently and reduce wasted consumption. For example, smart home energy management systems allow real-time power tracking and adaptive energy usage to eliminate waste.
  • Smart smoke detectors can accurately indicate the location of a fire hazard and even activate fire-extinguishing systems. The data collected through IoT improves visibility in heavy smoke during firefighting operations.
  • Home security cameras with motion detection can deter and reduce burglaries. These devices also have two-way communication systems that can be used for effective pet care. 
  • Smart door locks use passcodes, fingerprint, and face detection for access into the house. These devices are connected to the internet and can also be controlled using a mobile app.

IoT in the manufacturing industry 

  • IoT-driven manufacturing employs advanced sensors to collect data continuously from the manufacturing process. This data is fed to cloud-based predictive analytics to extract useful insights; potential excursions in quality attributes can be predicted well before they actually happen, so corrective action can be taken in time. This reduces process failures and averts huge losses to the business.
  • IoT also alerts operators to potential breakdowns and schedules preventive maintenance accordingly to avoid downtime.
  • IoT has huge applicability in managing the supply chains for raw materials and the distribution channels for finished goods, making the manufacturing supply chain more robust and reducing wastage.
  • IoT enables the interconnectivity of different operations of the manufacturing process so that each operation can digitally communicate with the other. This improves the efficiency of a manufacturing process significantly.
  • It has realized the concept of remote manufacturing where an operation can be performed remotely using software or an app.

Conclusion 

The Internet of Things, used in combination with cloud computing and predictive analytics, has transformed our devices into smart entities that can make their own best decisions and continually improve at the activities they perform. These devices have had a huge impact on how and what we do. The future of IoT is very promising, but every good thing comes at a price: security is the biggest concern, as networks must remain open for devices to stay interconnected.

Curious to know more?


Statistical Process Monitoring of Critical Attributes

INTRODUCTION

Statistical Process Monitoring is used to verify that critical quality attributes remain strictly controlled within their specified limits. Statistical control charts, process capability analysis, and similar tools are the ones most commonly used for the process monitoring, root cause analysis, and process improvement required in Continual Process Verification (CPV). In this article, we focus on the statistical control charts that are frequently used to verify that a process remains in a state of statistical control.

According to the USFDA guidance on process validation (2011), a one-time demonstration of process reproducibility before the start of commercial manufacture is inadequate; there should be continual assurance of reproducibility throughout the entire process life cycle. This continual assurance that the process remains in a state of validation during commercial manufacture is what CPV provides. The ultimate objective of a CPV program is to trend the identified parameters, detect out-of-trend (OOT) signals, and implement Corrective and Preventive Actions (CAPA).

WHAT IS A CONTROL CHART?

A control chart (synonyms: Shewhart chart, statistical process control chart) is a graphical representation of a process over time. It has a center line at the process average, with an upper control limit above it and a lower control limit below it. The center line and the control limits are computed from historical data. The process is considered to be in a state of statistical control as long as the current data falls between the upper and lower limits of the chart. The figure below shows what a typical control chart looks like.

Figure: A typical control chart.

Types of control charts

I. Univariate control charts for Process Monitoring

  1. Xbar and range chart
  2. Xbar and standard deviation chart
  3. Individual and moving range chart

II. Multivariate control charts for Process Monitoring

  1. Hotelling's T² chart
  2. Multivariate Exponentially Weighted Moving Average (MEWMA) chart.
  3. Multivariate Cumulative Sum of Deviations chart.

UNIVARIATE CONTROL CHARTS

Univariate control charts are used to monitor processes with a single variable or with multiple uncorrelated variables. The most common univariate control charts are discussed below.

  • The Xbar and range (Xbar-R) chart is a pair of control charts showing how the process average changes over time (the Xbar chart) and how the range (max − min) changes over time (the R chart). The measurements taken at a given time constitute a subgroup. The control limits are computed from the grand average and the average range using tabulated constants. These charts are very useful for finding out whether the process is stable and predictable.
Figure: Xbar-R chart of diameter.
  • The Xbar and standard deviation (Xbar-S) chart is again a pair of charts, showing how the process average and standard deviation change over time. The average and standard deviation are used to estimate the upper and lower control limits. These charts are used when the subgroup size is large (n > 10), since the standard deviation is considered to give a better picture of process variation than the range.
Figure: Xbar-S chart of diameter.
  • The individual and moving range (I-MR) chart is a pair of charts showing individual values and their moving range over time; the moving range is the difference between two successive data points. The center line and control limits are determined from the average of the individual values and the variation estimated from the historical data.
Figure: Individual and moving range (I-MR) chart of diameter.

As long as all subgroups remain within the upper and lower control limits, the process is said to be in a state of statistical control, i.e., only common-cause variation is present. If a non-random pattern appears or a subgroup falls outside the limits, the variation is attributed to an assignable cause and has to be closely monitored and investigated.
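As a rough illustration, the sketch below computes Xbar-R control limits from hypothetical historical subgroups and flags out-of-control points. The data are simulated, and the constants A2 = 0.577, D3 = 0, D4 = 2.114 are the standard tabulated values for a subgroup size of 5:

```python
import numpy as np

# Hypothetical historical data: 20 subgroups of 5 measurements each.
rng = np.random.default_rng(0)
subgroups = rng.normal(loc=10.0, scale=0.2, size=(20, 5))

xbar = subgroups.mean(axis=1)                           # subgroup averages
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)  # subgroup ranges

xbarbar, rbar = xbar.mean(), ranges.mean()
A2, D3, D4 = 0.577, 0.0, 2.114                          # constants for subgroup size n = 5

ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar

for i, (x, r) in enumerate(zip(xbar, ranges), start=1):
    if not (lcl_x <= x <= ucl_x) or not (lcl_r <= r <= ucl_r):
        print(f"Subgroup {i}: out of control (xbar={x:.3f}, R={r:.3f})")
```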

However, univariate control charts can be misleading for multivariate processes, particularly when the variables are dependent and correlated. In such cases, multivariate statistical tools are used to develop the control charts.

MULTIVARIATE CONTROL CHARTS

Univariate charts cannot be used for processes with two or more correlated variables. Instead, multivariate control charts are used to determine how correlated variables jointly affect process outcomes. Given the limited scope of this blog, we restrict the discussion to Hotelling's T² chart.

Hotelling's T² Chart for Process Monitoring

This control chart is named after Harold Hotelling, who developed a statistic that allows multiple correlated variables to be plotted on a single chart, known as Hotelling's T² chart. The data may be either individual observations or subgroups. Generally, historical data is used to develop a target statistic against which current or future data is compared. When the chart is constructed and evaluated on the same preliminary data used to establish the control limits, it is known as a Phase I chart; when limits derived from that historical data are used to monitor new observations, it is known as a Phase II chart. The chart can detect excursions in the means and reveal associations between correlated variables.

Hotelling's T² Statistic

Suppose x₁ and x₂ are two critical quality attributes that follow a bivariate normal distribution. Let μ₁ and μ₂ be the in-control mean values and σ₁ and σ₂ the standard deviations of the attributes, and let x̄₁ and x̄₂ be the sample averages computed from samples of size n. The covariance between x₁ and x₂ is denoted by σ₁₂. The T² statistic (χ₀²) of the distribution is then given by:

χ₀² = [n / (σ₁²σ₂² − σ₁₂²)] × [σ₂²(x̄₁ − μ₁)² + σ₁²(x̄₂ − μ₂)² − 2σ₁₂(x̄₁ − μ₁)(x̄₂ − μ₂)]

This equation forms the basis for creating Hotelling's control chart for the process means μ₁ and μ₂. It indicates that as long as the process means remain close to μ₁ and μ₂, the value of χ₀² will stay below the upper control limit (UCL). If the mean of at least one attribute drifts out of control, χ₀² exceeds the UCL. The UCL is the upper α percentage point of the chi-square distribution with k degrees of freedom, where k is the number of monitored attributes. Process monitoring with this statistic is represented graphically in the figure below.

Figure: T² chart with all principal components.
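A minimal numerical sketch of this statistic, using numpy and scipy with made-up in-control parameters (the means, covariance matrix, subgroup, and α level are purely illustrative):

```python
import numpy as np
from scipy.stats import chi2

# Assumed in-control parameters for two correlated quality attributes.
mu = np.array([50.0, 20.0])
sigma = np.array([[4.0, 1.5],    # [sigma1^2, sigma12]
                  [1.5, 2.0]])   # [sigma12, sigma2^2]

n = 5  # subgroup size
sample = np.array([[51.2, 20.8], [49.7, 19.9], [52.1, 21.5], [50.6, 20.4], [51.8, 21.1]])
xbar = sample.mean(axis=0)

diff = xbar - mu
chi0_sq = n * diff @ np.linalg.inv(sigma) @ diff  # matrix form of the formula above
ucl = chi2.ppf(0.9973, df=2)                      # upper alpha point, k = 2 attributes

print(f"chi0^2 = {chi0_sq:.2f}, UCL = {ucl:.2f}, out of control: {chi0_sq > ucl}")
```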

Model-driven multivariate control charts (MDMVCC)

An MDMVCC is a control chart built on either a principal components model (inputs only) or a partial least squares model (inputs and outputs). It is used to monitor a multivariate process through a Hotelling's T² chart. In case of an out-of-specification event, the model enables identification of the root cause and of the contribution of individual variables to the event.

The Hotelling's T² chart is plotted using the principal component scores (or, in the case of PLS, the X-scores) of the historical data. For m historical observations and a retained components, the (upper-only) limit of the control chart is commonly taken as the scaled-beta limit

UCL(historical) = [(m − 1)² / m] × B(α; a/2, (m − a − 1)/2),

where B(α; ·, ·) denotes the upper α point of the corresponding beta distribution.

The current data is then added to the chart for the comparability analysis. The current observations, however, have a separate upper limit, commonly based on the F distribution:

UCL(current) = [a(m + 1)(m − 1) / (m(m − a))] × F(α; a, m − a).

The process is in a state of control as long as it stays below the upper control limit (UCL) on the T² chart. If an out-of-control signal is detected, the root cause can be identified by leveraging the underlying model: the individual contribution of each variable to the signal can be determined, guiding the appropriate corrective and preventive actions.
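A compact sketch of the model-driven idea, using scikit-learn's PCA on synthetic historical data; the number of components, the simulated process, and the F-based limit for new observations are assumptions for illustration rather than an exact recipe:

```python
import numpy as np
from scipy.stats import f
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 2))                       # two hidden process drivers
loadings = rng.normal(size=(2, 6))
historical = latent @ loadings + rng.normal(scale=0.1, size=(100, 6))  # 100 runs x 6 correlated variables

a = 2                                                    # retained principal components (assumed)
pca = PCA(n_components=a).fit(historical)
scores_hist = pca.transform(historical)
lam = scores_hist.var(axis=0, ddof=1)                    # score variances (eigenvalue estimates)

def t2(x):
    s = pca.transform(x.reshape(1, -1))[0]
    return float(np.sum(s ** 2 / lam))

m = historical.shape[0]
ucl = a * (m + 1) * (m - 1) / (m * (m - a)) * f.ppf(0.99, a, m - a)   # limit for new observations

new_obs = historical[0] + np.array([4.0, 0, 0, 0, 0, 0])  # a deliberately disturbed new observation
print(f"T2 = {t2(new_obs):.2f}, UCL = {ucl:.2f}, signal: {t2(new_obs) > ucl}")

# Rough per-variable contribution to the first score, to help trace a root cause.
print(np.round(pca.components_[0] * (new_obs - pca.mean_), 2))
```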

CONCLUSION

In conclusion, statistical control charts are among the most important process monitoring tools for ensuring a state of validation throughout the process and product life cycle. They have evolved dramatically over the years, from purely univariate tools to multivariate and model-driven ones. These charts help not only in detecting out-of-control signals but also in identifying the assignable causes behind them.

Curious to know more?


Covid19 Vaccines: Safety And Efficacy

Introduction

Today everybody is interested in the safety and efficacy of the Covid19 vaccines. People want to know more about the various vaccine options and which vaccine will best protect them against Covid19. To answer these and many other questions, this article describes the basic science and working principles behind the Covid19 vaccines.

What is a Vaccine?

A vaccine is a biological product that stimulates an immune response to a specific disease upon inoculation. The process by which a person becomes immune to a disease through vaccination (inoculation) is referred to as immunization.

Vaccines: Are They Really Effective?

Data speaks louder than words! According to the World Health Organisation (WHO), vaccines protect against more than 20 life-threatening diseases and prevent 2-3 million deaths every year from diphtheria, pertussis, tetanus, influenza, and measles.

Poliomyelitis, one of the most feared epidemic diseases ever encountered by humans, has been eliminated from most countries. WHO declared India a polio-free country in March 2014, an achievement widely credited to the extensive Pulse Polio vaccination campaign.

How Efficacious Are COVID-19 Vaccines?

The efficacy of a vaccine is determined in a double-blind randomized controlled trial. Participant volunteers are randomly assigned to either a treatment group or a control (placebo) group; neither the participant nor the experimenter knows who is receiving which. Both groups live their normal lives and are monitored for COVID-19 symptoms over a specified period of time. Vaccine efficacy is then calculated using the following formula:

Efficacy = 1 − (v / n) / (p / N) = [(p / N) − (v / n)] / (p / N)

where:
v - cases among the vaccinated
p - cases among the placebo group
N - number of participants in the placebo group
n - number of participants in the vaccinated group

For example, about 43,000 volunteers were enrolled in the Pfizer/BioNTech vaccine trial and distributed equally between the treatment and placebo groups; 170 participants in the placebo group and only 8 in the treatment group developed COVID-19 over the following months. This corresponds to a vaccine efficacy of about 95%.
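Plugging these approximate trial figures into the formula above gives a quick sanity check (the equal group size of roughly 21,500 is an assumption):

```python
# Approximate Pfizer/BioNTech trial figures quoted above.
n = N = 21_500        # vaccinated and placebo group sizes (assumed equal)
v, p = 8, 170         # COVID-19 cases among the vaccinated and placebo groups

efficacy = 1 - (v / n) / (p / N)
print(f"Vaccine efficacy = {efficacy:.1%}")   # about 95%
```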

So what does 95% efficacy indicate? It does not mean that only 5% of vaccinated people will develop COVID-19; it means that vaccinated people are 95% less likely to develop COVID-19 upon exposure.

Innovator              Brand Name                    Efficacy
Pfizer/BioNTech        Comirnaty                     95%
Moderna                mRNA-1273                     94%
RDIF                   Sputnik V                     92%
Novavax                Covovax                       89%
Bharat Biotech         Covaxin                       81%
Oxford/AstraZeneca     Covishield                    67%
Johnson & Johnson      Janssen COVID-19 vaccine      66%
Sinopharm              Sinovac                       51%

Though the efficacy of all these vaccines is calculated using the same formula, the trials were performed under different circumstances. Scientists point out that the Pfizer/BioNTech and Moderna trials took place when daily case numbers were comparatively low, whereas the Johnson & Johnson trial was conducted when daily cases were at their peak, so trial volunteers were more likely to encounter the virus. Moreover, the dominant strain (B.1.1.7) during the Pfizer/BioNTech and Moderna trials is considered less virulent than the variants (B.1.351 and P.2) circulating during the Johnson & Johnson trial. The opportunity for any individual volunteer to be exposed to COVID-19 is also highly variable.

There is a myth that vaccination eliminates any possibility of contracting the virus. That is not right: infection is still possible after vaccination, but the vaccine trains the immune system to deal with the virus. Consequently, a vaccinated patient might experience mild to moderate symptoms, while the chances of hospitalisation and death are significantly reduced.

Is there any other way to determine which vaccine is the most efficacious? Yes, but it would require all the vaccines to be studied together, head to head, in a single double-blind randomised clinical trial, and that is not the need of the hour.

Are Covid19 Vaccines Safe Enough?

Most COVID-19 vaccines are very safe, and it is perfectly normal to have mild side effects after vaccination. The common side effects include pain, redness, and swelling; others include tiredness, headache, muscle pain, nausea, fever, and chills. These side effects are short-term and attributable to the vaccines' mechanism of action.

However, it is important to mention that four cases of serious adverse reactions (blood clotting with a low platelet count) were reported in Norway after vaccination with the AstraZeneca vaccine; two of the patients died of haemorrhage and the other two were hospitalised. Similar adverse reactions were reported in Denmark, Italy, and the United Kingdom. The European Medicines Agency's (EMA's) safety committee concluded that unusual blood clots with a low platelet count are a very rare side effect of the vaccine: roughly 1 in 100,000 recipients may develop such clotting reactions, a risk many times lower than that of COVID-19-related death.

Will Covid19 Vaccines Demonstrate Same Effectiveness Despite Mutations in the Virus?

As data continues to be collected on the new variants of COVID-19, most vaccines are reported to still provide protection, because they elicit a broad immune response involving a range of antibodies and cells. Researchers note that most vaccines are designed around the spike protein, which is less likely to mutate; as long as the spike protein does not mutate substantially, the vaccines should continue to protect against COVID-19.

Conclusion

Herd immunity requires a large enough share of the population to be vaccinated to prevent widespread transmission of COVID-19. The rate at which the population is vaccinated will play a crucial role in ending deaths and emergency hospitalizations. It is important to reiterate that the vaccines are very safe and that mild, short-term side effects are the common ones.

Curious to know more about Let’s Excel Analytics Solutions LLP?


Introduction to Predictive Analytics in Health Care

Predictive analytics has had a huge impact on the healthcare system and finds a great many applications driving innovation in patient care. The purpose of this blog is to apprise you of the wonders predictive analytics is doing in patient care.

“Predictive analytics is a branch of Data Science that deals with the prediction of future outcomes. It is based on the analysis of past events to predict what is likely to happen next.”

Predictions have fascinated mankind since time immemorial. Nostradamus set forth prophecies about catastrophes, disease, health, and well-being. Who would have known that this art of foretelling could transform into a science, Predictive Analytics!

Advantages of applying predictive analytics in healthcare

  • Predict curable diseases at the right time.
  • Predict pandemic and epidemic outbreaks.
  • Mitigate the risks of clinical decision making.
  • Reduce the cost of medical treatments.
  • Improve the quality of patient life.

“Patient care has transitioned from relying on the extraordinary ability of a physician to diagnose and treat diseases to the use of sophisticated, state-of-the-art technology that delivers innovative care.”

For the purpose of this discussion, the applications of predictive analytics in healthcare are divided across three aspects of patient care:

  1. Diagnosis
  2. Prognosis
  3. Treatment

Use of predictive analytics in medical diagnosis

Early detection of cancer

Many Machine Learning algorithms are used by clinicians for the screening and early detection of precancerous lesions. QuantX (Qlarity Imaging) is the first USFDA-approved ML-based breast cancer diagnosis system. This computer-aided diagnosis (CAD) software assists radiologists in assessing and characterizing potential breast anomalies using Magnetic Resonance Imaging (MRI) data. Another image-processing ML application, developed by the National Cancer Institute (NCI), uses digital images of the cervix to identify potentially cancerous changes that require immediate medical attention.

Predisposition to certain diseases

Predictive analytics has huge potential for determining the occurrence of, and predisposition to, genetic and other diseases. This domain leverages data collected from the Human Genome Project to study the effect of genes linked to certain disorders, sometimes referred to as pleiotropic gene information. Many such models have been developed to estimate the risk of manifesting diseases like osteoporosis, diabetes, and hypertension in later stages of life.

Prediction of disease outbreaks

Predicting disease outbreaks that could eventually turn into epidemics or pandemics is an indispensable tool for emergency preparedness and disaster management. Many lives could be saved if such outbreaks were known in advance. However, researchers' efforts to model the spread of deadly diseases like the Covid19, Zika, and Ebola viruses have yet to bear fruit, most probably because of the complexity of data collection and the highly dynamic nature of pathogens such as viruses.

Use of predictive analytics in disease prognosis.

Deterioration of patients in ICU

Predictive algorithms built from continuous monitoring of a patient's vital signs are used to estimate the probability of deterioration and the need for immediate intervention within the next hour or so. It is well established that early intervention is highly successful in preventing patient deaths. These algorithms are also used for remote monitoring of patients in intensive care units (ICUs). Remote patient monitoring, also known as Tele-ICU, is highly effective in aiding intensivists and nurses during situations like Covid19, when the healthcare system is pushed to its limit.

Reducing hospital stays

Prolonged hospital stays and readmissions are very expensive for patients. Analysts constantly examine patient data to monitor how patients are responding to treatment, which helps avert unwarranted hospital stays. The effect of likely future outcomes on patient health can also be estimated to customize patient-specific treatment modalities that prevent readmissions.

Risk scoring for chronic diseases

Predictive analytical applications have been designed to identify patients who are at high risk of developing chronic conditions while the disease is still in its early stages. Early detection of disease progression allows better management of the condition; in the majority of cases, the prognosis can be controlled to a great extent, with a significant effect on the patient's quality of life.
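As a simplified, hypothetical illustration of risk scoring, the sketch below trains a logistic regression on invented patient features and outputs a risk probability for a new patient; the features, data, and decision threshold are assumptions, not a clinical model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data: [age, BMI, fasting glucose]; label 1 = developed the chronic condition.
X = np.array([[45, 24, 90], [62, 31, 140], [38, 22, 85], [57, 29, 130],
              [70, 33, 150], [29, 21, 80], [66, 35, 160], [50, 27, 110]])
y = np.array([0, 1, 0, 1, 1, 0, 1, 0])

model = LogisticRegression(max_iter=1000).fit(X, y)

new_patient = np.array([[60, 30, 135]])
risk = model.predict_proba(new_patient)[0, 1]   # probability of the high-risk class
print(f"Estimated risk score: {risk:.2f}")
if risk > 0.5:                                  # assumed decision threshold
    print("Flag for early intervention and closer monitoring.")
```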

Predictive analytics in treatment of diseases

Virtual hospital settings

Philips developed a concept technology for virtual hospital settings that provides predictive care to high-risk patients in their own homes. The analytics combines medical records from thousands of patients with the medical history of a particular (often elderly) patient to build predictive models that identify those at risk of needing emergency treatment in the next month. Various devices that provide alerts for potential emergencies are known as Automatic Fall Detection (AFD) devices. An AFD device continuously collects data on the patient's movements in all directions (using accelerometer sensors) and uses it to pick up the subtle differences between normal gait and a potential fall. The feature has become so popular that Apple added it to the Apple Watch Series 4.
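A toy sketch of the accelerometer idea: compute the magnitude of acceleration from (x, y, z) samples and flag a sudden spike followed by near-stillness as a possible fall. The thresholds and readings are invented for illustration and bear no relation to any real device's algorithm:

```python
import math

IMPACT_G = 2.5   # assumed spike threshold (in g) suggesting an impact
STILL_G = 0.3    # assumed tolerance around 1 g (gravity only) indicating stillness afterwards

def magnitude(sample):
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def possible_fall(samples):
    """Return True if a large spike is followed only by near-still readings."""
    mags = [magnitude(s) for s in samples]
    for i, m in enumerate(mags):
        if m > IMPACT_G and all(abs(later - 1.0) < STILL_G for later in mags[i + 1:]):
            return True
    return False

# Simulated trace: normal gait, an impact spike, then lying still (about 1 g).
trace = [(0.1, 0.2, 1.0), (0.3, 0.1, 1.1), (2.0, 1.8, 2.2), (0.0, 0.05, 1.0), (0.0, 0.0, 1.02)]
print("Possible fall detected:", possible_fall(trace))
```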

Digital twins

Another marvel of predictive analytics in patient care is digital twin technology. Here, predictive analytics, IoT, and cloud computing tools are used to develop a virtual representation of the human body. The virtual representation mimics actual biochemical processes by constantly collecting data from millions of similar patients. The data is modeled to project the possible cause of a patient's symptoms and suggest the most viable treatment modality for that patient's condition. The recommended treatment can be assessed virtually before being implemented on the patient, so possible complications can be identified and averted in advance.

Conclusion

The adoption of predictive analytics has ushered personalized, patient-centric transformation into the healthcare industry. Its scope is not limited to patient care alone: it also has huge potential to overhaul other areas of the healthcare system, such as administration, supply chain, engineering, and public relations.

Interested in building predictive analytical capabilities in your organization?


Searching DataSets for Data Analytics Projects and Self Directed Learning

Introduction

Technology has evolved rapidly over the past decade. These advancements have set off a trend of learning with technology, and to satisfy their learning needs, people are embracing self-directed learning. As the world prepares for the Fourth Industrial Revolution (I4.0), the workforce has to keep up with these advancements. At the same time, there has been quite a buzz around Machine Learning and Artificial Intelligence, which form the heart and soul of I4.0. In other words, learning Machine Learning is the need of the hour.

Now that learning Machine Learning is imperative, there are three success mantras for mastering it: PRACTICE, PRACTICE, and PRACTICE. But the basic question that comes to mind is: what should we practice on? A real dataset should be available to work on, as if dealing with a real ML problem. In this blog, we discuss some of the most popular data repositories for obtaining sample datasets to hone Machine Learning skills.

Data, DataSet, and Databases

Before we begin, it's important to clear the air with some basic definitions related to datasets.

What is data?

  • Data is a collection of information that is based on certain facts.

What is a dataset?

  • A dataset is a structured collection of data.

What is a database?

  • A database is an organized collection of multiple datasets.

The data used can be collected from various sources such as experiments, surveys, polls, interviews, and human observation. It can also be generated by machines and archived directly into databases.

DataSets For Machine Learning Projects


The choice of data is a crucial step in the success of a Machine Learning program, and the source of the dataset is equally important because it determines the reliability and trueness of the collected data. Some of the most popular repositories for acquiring Machine Learning datasets are discussed below.

KAGGLE  

This platform, owned by Google LLC, is a repository of huge datasets and code published by its users, the Kaggle community. Kaggle also allows users to build models on Kaggle datasets and to discuss with the community the problems they face in analyzing the data.

Kaggle also provides a platform for various open-source data Science courses and programs. It is a comprehensive online community of Data Science professionals where you can find solutions to all your data analytics problems.

UCI MACHINE LEARNING REPOSITORY

The UCI Machine Learning Repository is an open repository of Machine Learning databases, domain theories, and data generators. It was created by graduate student David Aha at the University of California, Irvine (UCI) around 1987. Since then, the Center for Machine Learning and Intelligent Systems at UCI has overseen the archive. It has been widely used for empirical and methodological research on Machine Learning algorithms.
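To show how little code it takes to start practicing on such a dataset, the sketch below loads the classic Iris data with pandas. The URL is the long-standing UCI path for this dataset; if it has moved, any CSV downloaded from the repositories discussed here works the same way:

```python
import pandas as pd

# Long-standing UCI location of the Iris dataset (swap in any local CSV if this path changes).
url = "https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data"
columns = ["sepal_length", "sepal_width", "petal_length", "petal_width", "species"]

iris = pd.read_csv(url, header=None, names=columns)
print(iris.head())
print(iris["species"].value_counts())
```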

QUANDL

Quandl is a closed-source repository of financial, economic, and alternative datasets used by analysts worldwide to inform their financial decisions. It is used by the world's top hedge funds, asset managers, and investment banks.

Due to its premium, closed-source nature, it is not suited to simply practicing Machine Learning algorithms. But given its specialization in financial datasets, it is important to include Quandl in this list. Quandl is owned by Nasdaq, the American stock exchange based in New York City.

WHO

The World Health Organisation (WHO) is a specialized agency of the United Nations headquartered in Geneva, Switzerland. It is responsible for monitoring international health and continually collects health-related data from across the world. WHO's data repository is called the Global Health Observatory (GHO), which collects and archives health-related statistics from its 194 member countries.

If you are looking to develop Machine Learning algorithms for health-related problems, the GHO is one of the best data sources. It holds a wide variety of information, ranging from specific diseases, epidemics, and pandemics to world health programs and policies.

GOOGLE DATASET SEARCH

Google Dataset Search is a search engine for datasets powered by Google. It uses simple keyword search to find datasets hosted in different repositories across the web, indexing around 25 million publicly available datasets. Much of the indexed data is government data, alongside a wide variety of other datasets.

AMAZON WEB SERVICES (AWS)

Amazon Web Services (AWS) is the world's largest cloud services provider. AWS maintains a registry of open datasets that can be used to search for and host a wide variety of resources for Machine Learning. The registry is cloud-based, allowing users to add and retrieve data of virtually any form and scale. AWS also enables data visualization, data processing, and real-time analytics for making well-informed, data-driven decisions.

Conclusion

People are preparing for Workforce 4.0 by constantly acquiring new skills, and Machine Learning is one of the most indispensable skills for tomorrow's workforce. In today's world of digital revolution, information is available at our fingertips; open datasets for Machine Learning are likewise freely available and can be used to build algorithms for making informed decisions.

Let’s Excel Analytics Solutions LLP can support your organizational needs to develop digitalized tools for reinventing the business.


Curious to know more?


Digital Twin: Introduction, Working, and Applications

What is a Digital Twin?

A digital twin is a virtual reflection of a physical object, generally driven by three technological marvels:

  • Internet of Things (IoT),
  • Cloud, and
  • Advanced Analytics.

A digital twin constantly collects real-time data and feeds it into a simulated virtual replica of the physical object. This virtual replica can then be used to provide solutions to problems experienced by the physical object.

The term 'Digital Twin' was coined by Michael Grieves in 2002. However, the concept is as old as Apollo 13 (1970). Though Apollo 13 was a failed moon mission, it hinted at the coming virtualization of the physical world. On its way to the moon, around 330,000 km from Earth, the spacecraft sent Mission Control in Houston an SOS: "Houston, we have a problem." Oxygen levels in the spacecraft had started declining fast, and a dramatic rescue mission began to bring the onboard astronauts back to Earth. The key to this mission was that NASA had a physical replica of Apollo 13 on the ground. Engineers performed a series of troubleshooting measures on the replica and came up with the best possible solutions for bringing back the crippled spacecraft. All three crew members were returned safely. The mission shaped the future of space exploration and is popularly remembered as a "successful failure."

“Houston, we have a problem”.

Unlike Apollo 13, the replicas used in current NASA programs are digital and monitored virtually. NASA has been using true digital twin technology continuously to solve the day-to-day problems encountered in the operation and maintenance of its space programs, without needing to be physically present.

Another milestone in the history of digital twins was the launch of Predix by GE Digital (a subsidiary of General Electric). Predix is an Internet of Things (IoT) platform that combines cloud computing and data analytics and is used to improve the operational efficiency of machines. In 2015, Colin J. Parris, Vice President at GE Global Research, demonstrated to the world:

  • how a computer program could predictively diagnose malfunctions in the operation of a steam turbine, and
  • even perform maintenance activities remotely.

GE has been continuously monitoring hundreds of such turbines using their digital twins for over a decade now.

Working of Digital Twin

  • The physical object, also known as an asset, is designed to have many, sometimes hundreds, of sensors. These sensors capture real-time data (about almost everything) and send it across to its digital twin.
  • The digital twin analyzes this information and combines it with data from hundreds of other similar assets, using:
    • the Internet of Things (IoT),
    • cloud connectivity, and
    • predictive data analytics.
  • Additionally, the information shared with the digital twin is simulated against the various design features of the asset.
  • The simulation is used to answer two important questions:
    • What could go wrong?
    • What could be done about it?
  • This knowledge is used to build a learning platform that makes digital twins smarter every time additional information is added (a minimal sketch of this loop follows the list).
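A heavily simplified sketch of that loop in Python: a twin object ingests sensor readings, updates its state, and answers the two questions with a naive rule. The class name, thresholds, and readings are invented for illustration:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class PumpTwin:
    """Toy digital twin of a pump, tracking vibration readings from its sensors."""
    vibration_history: list = field(default_factory=list)
    vibration_limit: float = 4.0  # assumed design limit (mm/s)

    def ingest(self, vibration: float) -> None:
        self.vibration_history.append(vibration)

    def what_could_go_wrong(self) -> str:
        recent = mean(self.vibration_history[-5:])
        return "bearing wear likely" if recent > self.vibration_limit else "no issue detected"

    def what_to_do_about_it(self) -> str:
        if self.what_could_go_wrong() == "no issue detected":
            return "continue normal operation"
        return "schedule preventive maintenance"

twin = PumpTwin()
for reading in [2.1, 3.4, 4.5, 4.9, 5.6]:  # simulated real-time sensor feed
    twin.ingest(reading)

print(twin.what_could_go_wrong(), "->", twin.what_to_do_about_it())
```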

Applications of Digital Twin

Use of digital twins in patient care

Philips is pioneering the concept of a virtual representation of a patient's health status, i.e., each patient would have a digital twin that enables the right type of treatment, in the right way, at the right time. For example, if a patient presents with a particular symptom, the digital twin uses diagnostic data in combination with the patient's medical history and a variety of other available medical information to build a digital model that recommends the patient-specific treatment modality with the best possible outcome. The digital twin also allows the treatment to be simulated before the procedure is carried out on the patient in reality. During the procedure itself, it helps ensure fidelity and can even predict unforeseen complications so that they can be averted in the first place. Moreover, all of this information is stored in the cloud and can be retrieved anywhere, at any time.

Use of digital twins in manufacturing 

The digital twin of a manufacturing process uses IoT sensors that collect real-time process data continuously, enabling uninterrupted monitoring of the process and increasing its overall performance. Continuous monitoring also allows maintenance needs to be anticipated through advanced analytics, reducing process outages and downtime and saving millions of dollars. The amalgamation of advanced analytics and IoT can be used to manage the performance of the manufacturing process, which in turn improves the quality of the final product. It is important to note that the digital twin of a manufacturing process is not a single application but hundreds of interconnected applications, and the communication between them keeps the process in a state of control.

Virtualization is also taking over one of the most vital components of the manufacturing industry: supply chain management. A digital twin of the supply chain can automate organizational processes, such as purchasing and tracking assets and consumables based on anticipated usage. If there is a shortage of raw material, the twin can assess the possible impact on operations and offer the best available scenarios and solutions. This prepares an organization to overcome logistical challenges and improves its overall productivity.

Future of Digital Twin Technology

Digital twin technology is rapidly expanding its applicability to almost every industry and, in fact, almost everywhere. With the adoption of the fourth industrial revolution, Industry 4.0, the digital twin market is expected to grow enormously: the global market was valued at $3.1 billion in 2020 and is projected to reach $48.2 billion by 2026, a Compound Annual Growth Rate (CAGR) of about 58%.

The outbreak of COVID-19 has accelerated the implementation of digital twins in business models, particularly in the biotechnology and pharmaceutical industries, which are gearing up to upgrade existing infrastructure and adopt digitalized technologies to avoid crippling losses from frequent lockdowns. Governments are also keen on adopting the technology, as can be seen in the design of smart cities across the world. Singapore's smart city initiative is the best-fit example of digital twin technology in action: the model combines different technologies to develop a digital version of the city's resources, processes, and procedures, enabling the city to be supervised through a simple computer program.

Conclusion

The new normal of the pandemic has redirected and reinforced the adoption of digital twin technologies in every aspect of our lives. Digital twin technology is going to be a game-changer in fields like continuous manufacturing. Adopting the technology brings innumerable advantages, such as cost leadership, environmental sustainability, economic stability, and energy efficiency, and it is going to change the way businesses have ever been managed. Let's Excel Analytics Solutions LLP can support your organizational needs to develop digitalized tools for reinventing the business.


Curious to know more?


Chemometrics and How to Use It?

Introduction

"Chemometrics" is a combination of the words "chemo" and "metrics" and signifies the application of computational tools to the Chemical Sciences. The term was coined by the Swedish scientist Svante Wold in 1972. Later, in 1974, Svante Wold and Bruce Kowalski founded the International Chemometrics Society (ICS). The ICS describes chemometrics as the chemical discipline that uses mathematical and statistical models to
a) design or select optimal measurement procedures and experiments, and
b) provide maximum chemical information by analyzing chemical data.

How does Chemometrics help design optimal experiments?

Classical chemistry depends on the conventional one-factor-at-a-time (OFAT) approach to build an understanding of process chemistry, process performance, and product characteristics. However, this conventional technique suffers from several drawbacks:

  • OFAT studies are time-consuming and require a greater number of experimental runs,
  • they give no information about potential interactions between two or more factors, and
  • they may or may not arrive at the optimal settings for the process or product attributes.

Chemometrics, in turn, employs multivariate mathematical and statistical tools in combination with computational techniques to investigate the effect of multiple factors on process and product attributes simultaneously. The multivariate data is modeled into a mathematical equation that can predict the optimal settings for the process and the effect of excursions in the process parameters on process performance and product quality.

The outcome of a multivariate investigation is the identification of a multidimensional design space within which variations in the process parameters do not adversely impact process performance or product quality attributes. Moreover, multivariate strategies pack multiple process insights into a single multivariate design of experiments. Adopting a multivariate design of experiments offers several advantages over conventional OFAT:

  • It reduces product development timelines significantly,
  • it significantly reduces product development costs in a highly competitive market, and
  • it maximizes the total information obtained from the experiments.

How does Chemometrics help derive maximum information from the chemical data?

The multivariate strategy for analyzing chemical data starts with pretreatment of the data, also known as data preprocessing. In this step:

  • The data is scaled and coded,
  • Cleaned for outliers,
  • Checked for errors and missing values, and
  • Transformed, if need be, into a format that is explicitly comprehensible by the statistical and mathematical algorithms.

After preprocessing, chemometric tools look for patterns and informative trends in the data, a step referred to as pattern recognition. Pattern recognition uses machine learning algorithms which, in turn, use historical data stored in data warehouses to predict likely patterns in new data. Pattern recognition tools use either supervised or unsupervised learning algorithms: unsupervised examples include Hierarchical Cluster Analysis (HCA) and Principal Components Analysis (PCA), whereas supervised examples include K Nearest Neighbours (KNN).

What are the Different Tools and Techniques used in Chemometrics?

Over time, chemometrics has added multiple feathers to its cap rather than remaining a single tool applied to the Chemical Sciences. A wide variety of disciplines have contributed to its advancement, and it keeps adding new techniques to expand its applicability in chemical research and development.

  • Multivariate Statistics & Pattern Recognition in Chemometrics

Multivariate statistical analysis refers to the concurrent analysis of multiple factors to derive the totality of information from the data. The information derived may include the effects of individual factors, interactions between two or more factors, and quadratic terms of the factors. Because multivariate data analysis estimates almost all possible effects in the data, these techniques offer high precision and support highly reliable conclusions. Multivariate statistical tools and techniques find plenty of applications in the following industries:

  • Pharma and Life Sciences
  • Food and Beverages
  • Agriculture
  • Chemical
  • Earth & Space
  • Business Intelligence

Some of the most popular and commonly used multivariate modelling approaches are described briefly below.

  • Principal Components Analysis

Data generated in chemometrics, particularly in spectroscopic analysis, is enormous. Such datasets are highly correlated and difficult to model. To address this, Principal Components Analysis (PCA) creates new, uncorrelated variables known as principal components. PCA is a dimensionality reduction technique that enhances the interpretability of large datasets by transforming many correlated variables into a smaller number of components without losing much of the information. Let’s Excel Analytics Solutions LLP offers a simple yet highly capable web-based platform for PCA, branded as MagicPCA.
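A minimal PCA sketch is shown below, using Python and the open-source scikit-learn library rather than any particular commercial tool. The spectra-like data is simulated, and the choice of five components is arbitrary.

```python
# Minimal PCA sketch: compress correlated, spectra-like data into a few
# principal components. The simulated spectra are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
scores = rng.normal(size=(40, 3))                 # 3 hidden sources of variation
loadings = rng.normal(size=(3, 200))              # 200 correlated "wavelengths"
X = scores @ loadings + rng.normal(0, 0.05, size=(40, 200))

pca = PCA(n_components=5)
T = pca.fit_transform(StandardScaler().fit_transform(X))   # sample scores
print("Explained variance ratio:", pca.explained_variance_ratio_.round(3))
```

Typically, the first few components capture nearly all of the variance, which is exactly what makes the compressed representation easier to interpret than the raw correlated variables.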

  • Linear Discriminant Analysis

Linear Discriminant Analysis (LDA) is another multivariate technique based on dimensionality reduction. In LDA, however, the dependent variable is categorical while the independent variables are typically continuous (interval-scaled). LDA focuses on establishing a function that can distinguish between the different categories of the dependent variable, which helps identify the directions of maximum between-class variability in the data. Our experts at Let’s Excel Analytics Solutions LLP have developed an application, namely niceLDA, that can solve your LDA problems.
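The sketch below is a minimal LDA example with scikit-learn, again independent of any specific product; the two simulated classes and their separation are hypothetical.

```python
# Minimal LDA sketch: separate two labelled classes and project samples onto
# the single discriminant axis. The two simulated classes are illustrative only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)
class_0 = rng.normal(0.0, 1.0, size=(25, 6))
class_1 = rng.normal(1.5, 1.0, size=(25, 6))
X = np.vstack([class_0, class_1])
y = np.array([0] * 25 + [1] * 25)

lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)
print("Training accuracy:", lda.score(X, y))
print("First five discriminant scores:", lda.transform(X)[:5].ravel().round(2))
```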

  • Partial Least Squares

Partial Least Squares (PLS) is a multivariate statistical tool that bears some resemblance to Principal Components Analysis. It reduces the predictors to a smaller set of uncorrelated latent variables and subsequently performs linear regression on them. Unlike ordinary linear regression, however, PLS can fit multiple responses in a single model. Our programmers at Let’s Excel Analytics Solutions LLP have developed a user-friendly web-based application for partial least squares regression, EasyPLS.
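A minimal PLS regression sketch with scikit-learn follows; the predictors, the two responses, and the three latent variables are simulated assumptions chosen only to show the multi-response fit.

```python
# Minimal PLS sketch: regress two responses on many predictors through a
# small number of latent variables. All data is simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
X = rng.normal(size=(60, 50))                        # 50 predictors
Y = np.column_stack([
    X[:, :5].sum(axis=1) + rng.normal(0, 0.1, 60),   # response 1
    X[:, 5:10].sum(axis=1) + rng.normal(0, 0.1, 60)  # response 2
])

pls = PLSRegression(n_components=3).fit(X, Y)        # both responses, one model
print("R^2 averaged over the two responses:", round(pls.score(X, Y), 3))
```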

Application of Chemometrics in Analytical Chemistry

Chemometrics finds application throughout the entire lifecycle of the Analytical Sciences, from method development and validation, through development of sampling procedures, to exploratory data analysis, model building, and predictive analysis. Analytical data is multivariate in nature and relies on multivariate data analysis (MVDA) for exploratory analysis and predictive modeling. The three main areas of the Analytical Sciences where chemometrics has demonstrated advantages over conventional techniques are:

  1. Grouping or cluster analysis refers to a family of analyses in which a dataset is divided into clusters such that each cluster has unique properties that differ from the other clusters. A widely known example is flow cytometric analysis of cell viability, where cells are clustered based on apoptotic markers. Principal Components Analysis can be used as a powerful tool for understanding the grouping patterns.
  2. Classification analysis is defined as the systematic categorization of chemical compounds based on known physicochemical properties. This allows the exploration of alternatives to a known chemical compound with similar physicochemical properties. For example, in the development of an HPLC method for polar and aromatic compounds, data mining for suitable solvents can be done by looking into the polar and aromatic classes of solvents. This can be done by building SIMCA models on top of Principal Components Analysis.
  3. Calibration of analytical methods: chemometrics-assisted calibration employs multivariate calibration models in which multiple analytes, sometimes hundreds, are calibrated at the same time. These multivariate calibration models have many advantages over conventional univariate calibration models (a minimal calibration sketch follows this list). The major advantages include:
    1. significant reduction of noise,
    2. the ability to work with non-selective analytical methods,
    3. the ability to deal with interferents, and
    4. early detection and exclusion of outliers.
Principal Components Analysis and Partial Least Squares are the most commonly used chemometric tools for developing multivariate calibration models for analytical methods in pharmaceuticals, foods, environmental monitoring, and the forensic sciences. Chemometric tools have transformed the discipline of the Analytical Sciences by building highly reliable and predictive calibration models, providing tools that assist in their quantitative validation, and contributing to their successful application in highly sensitive chemical analyses.
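The sketch below illustrates the calibration idea with a PLS model that predicts two analyte concentrations from simulated spectra. The spectra, concentration ranges, and number of latent variables are illustrative assumptions, not a validated calibration procedure.

```python
# Minimal multivariate calibration sketch: calibrate two analytes at once from
# simulated spectra using PLS. Spectra, concentrations, and the component count
# are illustrative assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n_samples, n_wavelengths = 80, 150
concentrations = rng.uniform(0, 1, size=(n_samples, 2))      # two analytes
pure_spectra = rng.normal(size=(2, n_wavelengths))            # pure-component spectra
spectra = concentrations @ pure_spectra \
          + rng.normal(0, 0.02, (n_samples, n_wavelengths))   # mixture spectra + noise

X_train, X_test, y_train, y_test = train_test_split(
    spectra, concentrations, random_state=0)
calibration = PLSRegression(n_components=2).fit(X_train, y_train)
print("Calibration R^2 on held-out spectra:",
      round(calibration.score(X_test, y_test), 3))
```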

Application of Chemometrics in Studying QSAR in Medicinal Chemistry

QSAR stands for “quantitative structure–activity relationship” and refers to the application of a wide variety of computational tools and techniques used to determine the quantitative relationship between the chemical structure of a molecule and its biological activities. It is based on the principle that each chemical moiety is responsible for a certain degree of biological activity in a molecule and influences the activity of the other moieties in the same molecule. In other words, similarities in the structures of two chemical molecules can correspond to similarities in their biological activities. This forms the basis for predicting the biological activities of new drug molecules in medicinal chemistry.

For QSAR modeling, the features of a chemical molecule that can potentially affect its biological activities are referred to as molecular descriptors. These molecular descriptors fall into five major categories: physicochemical, constitutional, geometric, topological, and quantum chemical descriptors. The biological activities of interest in QSAR correspond to the pharmacokinetic, pharmacodynamic, and toxicological properties of the molecule. Each molecular descriptor is treated as a predictor and the corresponding biological activity as the response. The predictors are then modeled into a mathematical equation using multivariate statistical tools. Two widely accepted classes of statistical models are used for predicting the QSAR of a new molecule: regression models and classification models. The regression models used include multiple linear regression (MLR), principal components regression (PCR), and Partial Least Squares regression (PLS). Let’s Excel Analytics Solutions LLP has developed user-friendly interfaces for performing all these operations.
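As a minimal illustration of the regression route, the sketch below fits a multiple linear regression of a simulated activity on a handful of hypothetical descriptors and checks it by cross-validation. In real QSAR work the descriptors would be computed from molecular structures; here the descriptor names and values are invented for the example.

```python
# Minimal QSAR sketch: multiple linear regression of a biological activity on a
# few molecular descriptors. Descriptor values and activities are simulated;
# real QSAR work would compute descriptors from chemical structures.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_molecules = 40
descriptors = rng.normal(size=(n_molecules, 4))   # e.g., logP, MW, PSA, HBD (hypothetical)
activity = 2.0 * descriptors[:, 0] - 1.0 * descriptors[:, 2] \
           + rng.normal(0, 0.3, n_molecules)

mlr = LinearRegression()
cv_r2 = cross_val_score(mlr, descriptors, activity, cv=5, scoring="r2")
print("Cross-validated R^2:", round(cv_r2.mean(), 3))
```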

QSAR has also extended its approach to other fields such as chromatography (Quantitative Structure–Chromatography Relationship, QSCR), toxicology (Quantitative Structure–Toxicity Relationship, QSTR), biodegradability (Quantitative Structure–Biodegradability Relationship, QSBR), electrochemistry (Quantitative Structure–Electrochemistry Relationship, QSER), and so on.

Conclusion

Chemometrics has changed the way chemical processes are designed and developed. The information obtained from chemical data has maximized the degree to which processes can be optimized. It has also contributed significantly to the development of highly sensitive and accurate analytical methods by simplifying the large and complex datasets generated during the development, calibration, and validation of analytical methods. In general, chemometrics is an ever-expanding domain that is constantly diversifying its applications across a wide variety of fields.

Let’s Excel Analytics Solutions LLP has a proven track record of developing highly reliable chemometric applications that can help you make better business decisions. If you are dealing with a complex problem and looking for the right solution, schedule a free consultation now!

ISPE Pharma 4.0

Pharma 4.0: ISPE’s Vision for Operating Model

Introduction


ISPE stands for the International Society for Pharmaceutical Engineering, a non-profit organization founded by a group of experts to address new challenges in pharmaceutical manufacturing and to provide technical and non-technical leadership for managing the lifecycle of pharmaceutical products. In 2017, a Special Interest Group (SIG) was appointed to create a roadmap for bringing “Industry 4.0” to pharmaceutical manufacturing; its prime objective was to reinterpret Industry 4.0 for adoption and leverage by the pharmaceutical industry. ISPE’s “Pharma 4.0” is largely based on the same concepts and ideologies as Industry 4.0, but it additionally incorporates regulatory aspects based on ICH guidelines, specifically ICH Q8 and Q10.

History of Industry X.0

Industry 1.0: The First Industrial Revolution began in the 18th century with the use of machines and steam power to produce goods, particularly in the weaving industry. The mechanization of industries improved human productivity many times over.

Industry 2.0: The Second Industrial Revolution started in the 19th century with the harnessing of electricity. During this period, the concept of the production and assembly line was introduced, famously by Henry Ford. The production line simplified automobile manufacturing and increased its efficiency, in turn reducing production costs.

Industry 3.0: The Third Industrial Revolution started in the 20th century with the introduction of computers and their use to program industrial processes under human supervision.

Industry 4.0: The Fourth Industrial Revolution is currently ongoing. It enables the complete automation of industrial processes through advanced computers integrated into networked systems, allowing production systems to communicate with one another and leading to the emergence of smart factories.

Smart Factories: The various components of a smart factory communicate with each other and mark the inception of total automation. These components, known as Cyber-Physical Systems, employ advanced control systems operated by software capable of internet connectivity (the Internet of Things and the Internet of Systems), cloud computing, and cognitive computing. Efficient communication and the availability of information have enabled the digitization of manufacturing systems.

Germany was the first to adopt the Fourth Industrial Revolution, naming it “Industrie 4.0” when it initiated projects that promoted the digitization of manufacturing systems.

Barriers to Industry 4.0 in the Pharmaceutical Industry

It is fair to say that the pharmaceutical manufacturing industry has not kept pace with advancing technologies. This is largely attributable to stringent regulatory requirements that slow down implementation. For regulatory agencies, compliance with existing standards matters more than the adoption of new technologies, and there is a widespread belief that such a highly regulated industry cannot be left to machines. However, the industry has begun to realize that advanced technologies can enhance productivity and improve quality at the same time. This hints at the inception of automation in achieving regulatory compliance in pharmaceutical manufacturing.

Evolution of Industry 4.0 to Pharma 4.0

  • Pharmaceutical organizations often experience quality shortcomings that eventually lead to 483 observations and warning letters from regulatory agencies. Every year, approximately 4,500 drug products are recalled in the USA alone, and these recalls cost organizations a great deal.
  • Currently, the pharmaceutical industry is trying to adopt new strategies that can mitigate quality-related incidents. Lean Six Sigma tools are employed to improve product quality in pharmaceutical manufacturing.
  • In 2004, the US FDA published a guidance document entitled “Quality Systems Approach to Pharmaceutical Current Good Manufacturing Practices Regulations” that urged manufacturers to implement modern quality systems and risk-based approaches to meet the expectations of the regulatory agencies.
  • In 2009, the ICH Q8 guideline was revised to incorporate the principles of “Quality by Design” (QbD); it stated that quality cannot merely be monitored but should be built into the product. Despite these measures, quality violations of pharmaceutical products continue unabated.
  • A promising solution to these problems is the digitalization of operations; what is required is a model for implementing that digitalization. ISPE has pioneered the restructuring of Industry 4.0 to fit the pharmaceutical industry, now known as the ISPE Pharma 4.0 Operating Model.

Pharma 4.0 Operating Model

Framework of ISPE Pharma 4.0 Operating Model

Enablers

Pharma 4.0 enablers

  • Digital maturity
  • Data integrity by design

ICH derived enablers

  • Knowledge management and risk management

Elements

Pharma 4.0 elements

  • Resources
  • Information systems
  • Organization and processes
  • Culture

The above table depicts the basic structure and framework of the ISPE Pharma 4.0 Operating Model, which consists of two broad components:

  • Enablers
  • Elements

ICH defined Enablers: Knowledge Management and Risk Management

ICH defines knowledge management as a systematic approach to acquiring, analyzing, storing, and disseminating information related to products, manufacturing processes, and components.

The different sources of information include:

  • Product design and development
  • Technology transfer
  • Commercial manufacturing, etc.

Knowledge about the product and product-related processes needs to be managed right from product development, through commercial manufacturing, up to product discontinuation. It should be digitalized in the form of databases connected directly to the raw data sources; this ensures the integrity of all GxP and non-GxP data, helps in making better decisions, and builds regulatory confidence.

Various in-line, at-line, and on-line tools are used for:

  • Analysis of raw materials.
  • In-process monitoring
  • Final product analysis

These tools can be directly integrated into database systems for real-time data management.

ICH Q9 (Quality Risk Management) is a fundamental guideline that describes how potential risks to quality can be identified, analyzed, and evaluated.

This guideline is supported by ICH Q10 (Pharmaceutical Quality Systems) which describes a model for an effective quality management system.

The ICH Q10 implementation has three main objectives:

  1. Achieve product realization.
  2. Establish and maintain a state of process control.
  3. Facilitate continual improvement.

ICH Q10 provides guidance on critical quality attributes (CQAs), which should remain within specific ranges to ensure the desired product quality. The process parameters and material attributes that affect the CQAs are referred to as Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs), respectively.

ICH Q12 builds on ICH Q10 to include those parameters that are not critical to quality but contribute to the overall performance of the product. These attributes are known as Key Process Indicators (KPIs), and continuous efforts should be made to bring the KPIs under Six Sigma control.

Any excursions or changes in the CQAs, CPPs, CMAs, and KPIs should be communicated to the respective regulatory authorities; prior approval is required in certain cases before the implementation of the changes.

Pharma 4.0 Enablers: Digital Maturity and Data Integrity by Design

The first Pharma 4.0 enabler on the path to becoming a smart factory is Digital Maturity. It specifies an organization's ability to implement Pharma 4.0 and the path for doing so. The model is designed so that an organization can perform a gap assessment of its current position in digital maturity, the improvements needed in its capabilities, and the future capabilities it should build. The basic requirement for achieving digital maturity is computerization and interconnectivity across all quadrants of the operating model. After fulfilling these requirements, the organization can advance toward capabilities like data visibility, predictive capacity, and adaptability.

  • Data visibility: the ability to acquire, display, monitor, and analyze the data generated across all sources in the organization.
  • Data transparency: the ability to access the data regardless of what generated it and where it is located.
  • Data predictability and adaptability: the ability to predict future outcomes from the data and to improve the accuracy of those predictions as more data is added.

These capabilities help an organization make statistically grounded decisions because they are based on real-time data.

ICH E6 (Good Clinical Practice) defines data integrity as the extent to which data are complete, consistent, accurate, trustworthy, and reliable throughout the data lifecycle. The regulatory approval of a drug and all related processes depend on the quality and integrity of the submitted data. In 2016, the US FDA issued a guideline entitled “Data Integrity and Compliance with Drug CGMP” that focuses on developing effective strategies for maintaining data integrity throughout the life of the drug product.

These strategies should be based on quantitative risk assessments for patient safety. Moreover, data integrity should be built into products and related processes during design and development; this can be done by digitalizing data integrity, an approach known as ‘Data Integrity by Design’. Once digitalization is introduced, every process will have a defined workflow to avoid silos of information and data-integrity-related issues.

Pharma 4.0 Elements:

Resources:

Resources refer to the physical and intangible assets owned by an organization, broadly categorized into:

  • Human Resources
  • Machines
  • Products

The machines employed in Pharma 4.0 should be highly advanced and developed based on Artificial Intelligence and Machine Learning. They would be highly automated and adaptive to the ever-changing business needs of the organization. These machines can be connected to PAT tools for in-line, on-line, and at-line monitoring during the manufacturing of products; such capabilities enable machines to make their own decisions. To run these machines, however, a new generation of highly skilled people is required, referred to as Workforce 4.0. The success of Pharma 4.0 will largely depend on the engagement and continuous upskilling of Workforce 4.0 and on the choice of the Artificial Intelligence and Machine Learning platform.

Information Systems

An information system is an integrated set of components for collecting, storing, and processing data and for providing information, knowledge, and digital products. Through this integration, the components relate to each other, and this forms the basis for:

  • How data is interfaced
  • How processes are Automated
  • How processes gain the capability for predictive analysis.

Predictive analysis enables real-time release testing of products, referred to as “ad hoc reporting”, which is already being used by some organizations.

Another benefit of integration into information systems is the preventive maintenance of equipment. The equipment takes ownership of its own maintenance by analyzing daily data, flagging potential maintenance activities ahead of time, and in some cases rectifying abnormalities; this reduces equipment breakdown time significantly and thus increases overall productivity (a minimal sketch of such data-driven flagging follows below). There are more potential areas of integration into the information system, but they should adhere to global standards like GAMP 5, ISO, etc.
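The sketch below shows one possible way such maintenance flagging could work: an IsolationForest scoring daily equipment readings for anomalies. The sensor names, the simulated data, and the contamination rate are illustrative assumptions, not part of any Pharma 4.0 standard or a specific vendor's implementation.

```python
# Minimal sketch of data-driven maintenance flagging: an IsolationForest is one
# possible way to spot abnormal daily equipment readings. Sensor names, data,
# and the contamination rate are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(8)
# Simulated daily readings: vibration (mm/s) and motor temperature (deg C)
normal_days = np.column_stack([rng.normal(2.0, 0.2, 180), rng.normal(60, 2, 180)])
drifting_days = np.column_stack([rng.normal(3.5, 0.3, 5), rng.normal(72, 2, 5)])
readings = np.vstack([normal_days, drifting_days])

detector = IsolationForest(contamination=0.03, random_state=0).fit(readings)
flags = detector.predict(readings)             # -1 marks days needing attention
print("Days flagged for maintenance review:", np.where(flags == -1)[0])
```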

Organization and Processes

An organizational structure needs to be developed that builds processes capable of addressing prospective business challenges. Pharma 4.0 is a huge undertaking for an organization, and its outcomes are uncertain, hence a sound, step-by-step organizational structure is required. Organizational processes need to be developed across all elements of the holistic control strategy so that each element functions collaboratively.

Culture

Culture refers to the shared beliefs and values of an organization that help it achieve common organizational goals. It should promote collaborative contributions, as collaboration drives innovation. A culture should be developed in which people understand the importance of each Pharma 4.0 element and which percolates down to each stage of the product lifecycle, from early development through technology transfer to commercial manufacturing. New collaborations should be continually sought to improve existing capabilities and acquire new ones. People should be encouraged to adapt to change, as continuous upgrading is a requirement for survival in an ever-changing market.

Existing Control Strategy vs Holistic Control Strategy

  • The existing control strategy was once a game changer that improved quality oversight in manufacturing; however, it only reports quality, i.e., it can tell what has gone wrong, but it cannot predict when and what can go wrong. It maintains process control by continuously monitoring manufacturing processes for process-related excursions.
  • The Holistic Control Strategy as described by ISPE is based on ICH and Pharma 4.0 enablers and elements that provide control over the production process to ensure a flexible, agile, sustainable, and reliable manufacturing system with lower risks to patients, processes, and products. However, its success depends on the mutual consensus between industry and regulatory agencies.

Barriers to Pharma 4.0

Even though the Pharma 4.0 model might initiate a new era of smart pharmaceutical manufacturing, there are several barriers to the adoption of this model.

 The main barriers involved are:

  • High cost of digitization
  • Time-consuming implementation
  • Shortage of a skilled and trained workforce
  • Uncertainty of the outcomes

Despite all these barriers, particularly the cost factor, Pharma 4.0 is going to become a reality and a pressing business need for sustainability. At Let’s Excel Analytics Solutions LLP, we have developed cloud-based platform technologies that drastically cut digitalization costs, so the barriers are quickly offset by the tremendous increase in productivity and the significant reduction in downtime.

Summary

Pharma 4.0 digitalization is an imperative and inevitable transition that the pharmaceutical industry is undergoing, and Let’s Excel Analytics Solutions LLP is here to support a smooth transition to Pharma 4.0.

Curious to know about our automation-accelerating machine learning platform?