codedata

78% of C-level executives do not have a real-time view of inventory across their supply chain channels, and 50% of them believe they do not have the right platforms or technology in place to support expanded fulfillment options. Which group are you in?

 

These are the findings from a survey of 221 global C-level executives on supply chain visibility, jointly undertaken by JDA, Microsoft, and Incisiv. For details, refer to the article link at the bottom of this post.

 

#SupplyChainVisibility does not mean a larger number of reports, dashboards, and Excel spreadsheets. Nor does it mean how efficiently you share those reports and dashboards across your business units and manufacturing operations. Supply chain visibility should mean more than that – it ought to be an indication of what percentage of your decision making is driven by data gathered across the supply chain.

 

End-to-end supply chain visibility is hard, and it gets more complex with more SKUs, more locations, a larger geographic spread, and so on. But it is important to take the first step. What makes it hard, first, is that data must be strung together from tens of disparate systems, often hundreds, from planning to distribution. No single system in the chain gives supply chain managers enough visibility to drive better performance. Second, even when companies have the ability to harness data from these systems, most of the companies we have encountered lack the analytical capabilities required to analyze that data and derive meaningful recommendations that improve decision making. The result is data accumulation without intelligence.

 

One of our manufacturing supply chain prospects wanted to build a smart supply chain through meaningful use of data and learning technologies. Their initial requirement was for us to first build them a stable and performant data pipeline; once the pipeline was able to acquire data across all of their enterprise systems, we would build the learning capabilities in phase 2, starting with improved demand forecasting and continuing on to other areas of the supply chain.

 

But we proposed (and ultimately implemented) a different process. We arrived at the same outcome but followed a slightly more #agile path. We started on the data pipeline, but ingested data from only 3 of their 23 systems – the ones identified as having the most impact on the demand forecasting process. We then added the 2 external data sources they wanted included. Leveraging #CODAMind, we were able to build a data pipeline within a few weeks. We then implemented an initial version of machine learning for demand forecasting. The solution gave them the ability to include external factors like competitor pricing and weather in their #DemandForecasting process and allowed them to forecast further into the future. We did a few more iterations on the data sources and #MachineLearning algorithms and ended up with a decent learning system that improved their demand forecasting. We then moved on to other parts of the supply chain, making an initial pass at each of them in terms of data acquisition and learning.
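To make the idea concrete, here is a minimal sketch of folding external signals like temperature and competitor pricing into a regression-based demand forecast. The data, feature names, and model form are all hypothetical – this is not the #CODAMind pipeline, just an illustration of the technique using ordinary least squares.

```python
# Minimal sketch: demand forecasting with external (exogenous) features.
# Fits a linear model y = w0 + w1*last_week + w2*temperature + w3*competitor_price
# via the normal equations (X^T X) w = X^T y. Hypothetical data, illustrative only.

def fit_linear(X, y):
    """Ordinary least squares via normal equations with Gaussian elimination."""
    n, p = len(X), len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(p)] for i in range(p)]
    xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(p)]
    # Forward elimination with partial pivoting.
    for col in range(p):
        pivot = max(range(col, p), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[pivot] = xtx[pivot], xtx[col]
        xty[col], xty[pivot] = xty[pivot], xty[col]
        for r in range(col + 1, p):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, p):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    # Back substitution.
    w = [0.0] * p
    for i in reversed(range(p)):
        w[i] = (xty[i] - sum(xtx[i][j] * w[j] for j in range(i + 1, p))) / xtx[i][i]
    return w

def predict(w, row):
    return sum(wi * xi for wi, xi in zip(w, row))

# Each row: [bias, last week's demand, temperature, competitor price]
X = [
    [1, 100, 25, 9.99],
    [1, 110, 18, 9.49],
    [1,  95, 22, 10.49],
    [1, 120, 20, 8.99],
    [1, 105, 30, 9.79],
]
y = [112, 115, 103, 131, 121]  # observed demand per period

w = fit_linear(X, y)
forecast = predict(w, [1, 115, 24, 9.25])  # next period's features
print(round(forecast, 1))
```

In a real system the external columns would be joined in from the ingested weather and pricing feeds, and the linear model would be replaced by whatever learner the iteration settles on – the point is only that external factors enter the forecast as additional features.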

 

As the next phase of the implementation, we are taking another pass at the pipeline – adding more data sources and improving the overall accuracy of our learning algorithms. The final goal is to leverage the power of #AI to make the supply chain smart enough that actions can be taken to improve planning, production efficiency, and customer experience. For example, instead of providing a visual experience in the form of reports and dashboards, provide a set of recommendations accompanied by what-ifs to gauge the impact of those recommendations. That will demonstrate the real power of AI in the #SmartSupplyChain evolution.
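A what-if layer of that kind can be as simple as re-scoring a perturbed scenario against a baseline through the same model. The sketch below is hypothetical: `forecast_demand` stands in for whatever trained model the pipeline exposes, and the pricing-sensitivity numbers are made up.

```python
# Sketch of a what-if comparison: re-run the same model on a modified
# scenario and report the delta against the baseline forecast.

def forecast_demand(features):
    # Stand-in for a trained model: demand falls as our price
    # rises above the competitor's (hypothetical coefficients).
    gap = (features["our_price"] - features["competitor_price"]) / features["competitor_price"]
    return features["base_demand"] * (1.0 - 0.4 * gap)

def what_if(baseline, **changes):
    """Return (baseline forecast, scenario forecast, delta) for a change set."""
    scenario = {**baseline, **changes}
    b = forecast_demand(baseline)
    s = forecast_demand(scenario)
    return b, s, s - b

baseline = {"base_demand": 120.0, "our_price": 10.49, "competitor_price": 9.99}
b, s, delta = what_if(baseline, our_price=9.99)  # what if we match their price?
print(f"baseline={b:.1f} scenario={s:.1f} delta={delta:+.1f}")
```

Presenting recommendations alongside deltas like this – rather than raw dashboards – is the shape of the experience described above.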

 

Today’s post is inspired by: https://www.supplychaindigital.com/technology/c-level-retail-executives-lack-visibility-supply-chain-report-finds

 

codedata

Total QA is seriously interesting, and until last year it was only talked about. Now, with advancements in Autonomous Machine Vision, Total QA is not just a concept – it can be realized by manufacturers themselves, and it can be implemented, and value realized from it, very quickly.

 

Total QA is the realization of a concept in which quality control is not limited to specific steps on the production line but is implemented at every step along the line, using the same QA systems.

 

Pretty awesome, huh?

 

Advancements in Machine Learning, Computer Vision, and Optics have made Autonomous Machine Vision, and hence Total QA, possible. Based on what I am reading, AMVs not only capture images of the manufacturers’ products, they also use the power of AI to configure themselves to the production environment they are in. That in itself is powerful – less configuration reduces complexity and setup lead time. Moreover, in the event of performance irregularities like a bad lens or a bad sensor, these systems are capable of performing self-diagnosis to identify the issue, significantly reducing their own mean time to repair (MTTR). This means less downtime for manufacturers’ QA systems, while also avoiding costly repairs and human intervention.

 

I have yet to see a real one in action, but with AMVs driving Total QA, manufacturers will finally benefit from powerful visual QA systems with very short implementation lead times and rapid diagnostics. Not to mention the impact the stream of continuously generated product quality data will have on manufacturers’ #smartmanufacturing and #industry4.0 initiatives.

 

We at #codedataio celebrate #ai powering the next wave of manufacturing, and we are excited about the impact #autonomousmachinevision will have on reducing product recalls and continuously improving product quality for manufacturers.

codedata

Gartner’s research predicts that people will increasingly rely on the outcomes of AI solutions. By 2022, 30% of consumers in mature markets will rely on AI to decide what they eat, what they wear, or where they live, said the analyst firm.

 

Relying on AI to tell me where I should live may be OK, because I have seen enough data points to validate the “truth” in AI-based recommendations. But in so many other areas it may be hard to accept the truth proposed by AI. And that is precisely what makes AI a hard sell.

 

Why would you trust the recommendations made by an AI engine, especially when it is a black box? Prospects and customers at #codedataio have voiced similar concerns. They are interested in making the leap, but the AI black box makes it hard for them to rely on our recommendations. The “trust factor” will certainly improve over time, as they gather more data points to validate the impact of our recommendations on their businesses.

 

That’s where Explainable AI comes in. We are following how the space evolves and will add tools as they become available to include XAI as part of #codaai, but until then we let your data scientists dive into the data behind the recommendations.
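Until dedicated XAI tooling lands, one crude but useful way to peek inside a black-box model is permutation importance: shuffle one input at a time and measure how much the model’s error grows. The sketch below is generic and uses a made-up stand-in model – it is not the #codaai internals.

```python
import random

# Sketch of permutation importance: how much does prediction error grow
# when one feature's values are scrambled? Illustrative data and model.

def model(row):
    # Stand-in black box: depends heavily on feature 0, weakly on
    # feature 1, and not at all on feature 2.
    return 3.0 * row[0] + 0.5 * row[1]

def mse(rows, targets):
    return sum((model(r) - t) ** 2 for r, t in zip(rows, targets)) / len(rows)

def permutation_importance(rows, targets, feature, trials=20, seed=0):
    """Mean increase in MSE when `feature` is shuffled across rows."""
    rng = random.Random(seed)
    base = mse(rows, targets)
    bumps = []
    for _ in range(trials):
        col = [r[feature] for r in rows]
        rng.shuffle(col)
        shuffled = [r[:feature] + [v] + r[feature + 1:] for r, v in zip(rows, col)]
        bumps.append(mse(shuffled, targets) - base)
    return sum(bumps) / trials

rows = [[i, i % 5, i % 3] for i in range(30)]
targets = [model(r) for r in rows]
scores = [permutation_importance(rows, targets, f) for f in range(3)]
print(scores)  # feature 0 should dominate; feature 2 should score ~0
```

A ranking like this does not fully explain a recommendation, but it tells a data scientist which inputs the model actually leaned on – a starting point for the kind of dive into the data described above.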

 

To read more on #explainableai, follow the link below.

 

https://www.computerweekly.com/news/252457364/Explainable-AI-How-and-why-did-the-AI-say-true