How AI Thinks Industry 4.0 Will Play Out
Today’s post was created by an AI model from copy.ai. I prompted the model to generate a post based on the following bullet points and the title of this post.
- AI is being deployed into industry for predicting maintenance actions and identifying visual defects to reduce manufacturer costs and improve product quality.
- New AI techniques such as reinforcement learning have been successful in other domains, like gaming, but have yet to truly impact manufacturing.
- Robotics continues to advance at a rapid pace to automate more complex and tedious tasks.
Here’s what it came up with (completely unedited):
There are a couple of forces driving the development of 4.0. Industry 4.0 was designed to provide industry operators with more data, faster access to it, and automated processing of that data. The use of big data coupled with AI scanners and better microprocessors is enabling technologies such as molecular manufacturing at scale. Industry 4.0 can also be seen as an effort to unify different technologies within one system. As technology continues to advance in various areas, general purpose computing devices will become smaller, more affordable, and in many cases capable of out-performing specialized computers….
My prediction for 2021 will be that AI will be used at an increasing scale, but not necessarily in every industry. Instead, I predict that AI will be used to boost efficiency and lower costs but not necessarily replace human workers. In 2021, AI will mostly be used for adding value to inputs and outputs, rather than doing all the work itself. The value added by an automated system will depend on how well it matches human needs (and vice versa). Ultimately it will be a case of “will they or won’t they?”
First Person On the Ground (FPOG) technology is getting better - memorizing skills like driving or recognizing objects are naturally transferable between hands and machines. There will be no end to the degree of roboticization that machine learning will enable.
I was a bit disappointed by these results given the bullets fed to the model. The model probably needs to read Exponential Industry to get up to speed. Overall, I think it’s safe to say there’s still a lot of work to do before machines are able to write coherent assembly instructions, maintenance manuals, or production schedules.
Assembly Line
Go beyond machine learning to optimize manufacturing operations
Machine learning depends on vast amounts of data to make inferences. However, sometimes the data that machine-learning algorithms need simply is not available. SRI International has developed a system called Deep Adaptive Semantic Logic (DASL) that uses adaptive semantic reasoning to fill in the data gaps. DASL integrates bottom-up, data-driven modeling with top-down theoretical reasoning in a symbiotic union of machine learning and knowledge-guided inference. The system brings experts and data together to make better, more informed decisions.
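For a rough sense of what knowledge-guided inference can look like in practice, here is a minimal Python sketch, my own illustration rather than anything from DASL, in which hand-written expert rules backstop a data-driven score when training data is too thin to trust the model alone:

```python
# Illustrative only: a toy fusion of a "learned" score with expert rules,
# mimicking the general idea of knowledge-guided inference. The Reading
# fields and all thresholds are invented for this example.
from dataclasses import dataclass

@dataclass
class Reading:
    vibration: float    # mm/s RMS
    temperature: float  # degrees C

def learned_score(r: Reading) -> float:
    """Stand-in for a model trained on limited historical sensor data."""
    return 0.6 * (r.vibration / 10.0) + 0.4 * (r.temperature / 100.0)

def apply_domain_rules(r: Reading, score: float) -> float:
    """Top-down expert knowledge covers cases the sparse data never saw."""
    if r.temperature > 95.0:  # expert rule: overheating is always critical
        return max(score, 0.9)
    if r.vibration < 0.5:     # expert rule: near-zero vibration suggests a dead sensor
        return 0.5            # flag for inspection instead of trusting the model
    return score

reading = Reading(vibration=7.2, temperature=97.0)
print(f"failure risk: {apply_domain_rules(reading, learned_score(reading)):.2f}")
```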
Harvesting AI: Startup’s Weed Recognition for Herbicides Grows Yield for Farmers
In 2016, former dorm-mates at École Nationale Supérieure d’Arts et Métiers in Paris founded Bilberry. Today the company develops weed recognition powered by the NVIDIA Jetson edge AI platform for precision application of herbicides on corn and wheat farms, offering as much as a 92 percent reduction in herbicide usage.
Driven by advances in AI and pressures on farmers to reduce their use of herbicides, weed recognition is starting to see its day in the sun.
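To make the precision-application idea concrete, here is a toy sketch, mine and not Bilberry’s, that maps weed bounding boxes from a hypothetical onboard detector to the spray-boom nozzles that should open; the nozzle count, frame width, and detections are all invented:

```python
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # weed bounding box: x, y, width, height (pixels)

NOZZLE_COUNT = 16   # hypothetical number of nozzles across the spray boom
FRAME_WIDTH = 1920  # camera frame width in pixels

def detect_weeds(frame) -> List[Box]:
    """Stand-in for an onboard CNN detector; returns two toy detections."""
    return [(100, 400, 60, 60), (1500, 380, 80, 90)]

def boxes_to_nozzles(boxes: List[Box]) -> List[bool]:
    """Open only the nozzles whose swath overlaps a detected weed."""
    nozzle_width = FRAME_WIDTH / NOZZLE_COUNT
    active = [False] * NOZZLE_COUNT
    for x, _, w, _ in boxes:
        first = int(x // nozzle_width)
        last = int((x + w) // nozzle_width)
        for n in range(first, min(last, NOZZLE_COUNT - 1) + 1):
            active[n] = True
    return active

print(boxes_to_nozzles(detect_weeds(frame=None)))  # None stands in for a real frame
```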
Amazon’s robot arms break ground in safety, technology
Robin, one of the most complex stationary robot arm systems Amazon has ever built, brings many core technologies to new levels and acts as a glimpse into the possibilities of combining vision, package manipulation and machine learning, said Will Harris, principal product manager of the Robin program.
Those technologies can be seen when Robin goes to work. As soft mailers and boxes move down the conveyor line, Robin must break the jumble down into individual items. This is called image segmentation. People do it automatically, but for a long time, robots only saw a solid blob of pixels.
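For readers curious what that step looks like in code, here is a generic instance-segmentation sketch using torchvision’s pretrained Mask R-CNN. It is not Amazon’s system, but the core move, turning a jumble into per-item masks, is the same idea:

```python
import torch
import torchvision

# Pretrained, general-purpose instance-segmentation model (downloads weights).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(pretrained=True)
model.eval()

image = torch.rand(3, 480, 640)  # placeholder camera frame (C, H, W), values in [0, 1]

with torch.no_grad():
    predictions = model([image])[0]  # the model accepts a list of images

# Keep only confident detections; each mask isolates one candidate item.
keep = predictions["scores"] > 0.7
masks = predictions["masks"][keep]  # (N, 1, H, W) soft masks, one per detected item
print(f"{masks.shape[0]} individual items segmented from the jumble")
```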
Classify This Robot-Woven Sneaker With 3D-Printed Soles as 'Footware'
For athletes trying to run fast, the proper shoe can be essential to achieving peak performance. For athletes trying to run as fast as humanly possible, a runner’s shoe can also become a work of individually customized engineering.
This is why Adidas has married 3D printing with robotic automation in a mass-market footwear project it calls Futurecraft.Strung, expected to be available for purchase as soon as later this year. Using a customized, 3D-printed sole, a Futurecraft.Strung manufacturing robot can place some 2,000 threads from up to 10 different sneaker yarns in one upper section of the shoe.
AI In Inspection, Metrology, And Test
“The human eye can see things that no amount of machine learning can,” said Subodh Kulkarni, CEO of CyberOptics. “That’s where some of the sophistication is starting to happen now. Our current systems use a primitive kind of AI technology. Once you look at the image, you can see a problem. And our AI machine doesn’t see that. But then you go to the deep learning kind of algorithms, where you have very serious Ph.D.-level people programming one algorithm for a week, and they can detect all those things. But it takes them a week to program those things, which today is not practical.”
That’s beginning to change. “We’re seeing faster deep-learning algorithms that can be more easily programmed,” Kulkarni said. “But the defects also are getting harder to catch by a machine, so there is still a gap. The biggest bang for the buck is not going to come from improving cameras or projectors or any of the equipment that we use to generate optical images. It’s going to be interpreting optical images.”
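To make "interpreting optical images" concrete, here is a toy PyTorch defect classifier, a minimal sketch and nothing like CyberOptics’ production algorithms, that maps a grayscale inspection image to a pass/defect call:

```python
import torch
import torch.nn as nn

class DefectNet(nn.Module):
    """Tiny CNN mapping a 64x64 grayscale inspection image to pass/defect."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16 * 16, 2))

    def forward(self, x):
        return self.head(self.features(x))

model = DefectNet()
batch = torch.rand(8, 1, 64, 64)  # placeholder inspection images
logits = model(batch)             # untrained here; real use needs labeled defect data
print(logits.argmax(dim=1))       # 0 = pass, 1 = defect (toy labels)
```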
How to use simulation as a network optimization tool
Imagine watching a live football match in a crowded stadium, which causes an expected surge in network traffic. This surge is usually handled by adding a base station. Thanks to advancements in simulators that support multi-user and multi-cell coverage, they can now replicate certain aspects of real networks. In this blog post we reveal how a radio network simulator can be used to simulate these scenarios and decide on an ideal location for base station placement.
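As a back-of-the-envelope analogue of what such a simulator does, here is a toy Python search, my own sketch with an invented path-loss model and made-up numbers, that scores candidate base station sites by simulated coverage:

```python
import math
import random

random.seed(0)
users = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(500)]
candidates = [(25, 25), (50, 50), (75, 75)]  # hypothetical site coordinates (m)

TX_POWER_DBM = 43.0   # typical macro-cell transmit power
THRESHOLD_DBM = -60.0 # toy receiver sensitivity, chosen to differentiate sites

def received_power(site, user) -> float:
    """Toy log-distance path loss; real simulators model far more physics."""
    d = max(math.dist(site, user), 1.0)
    return TX_POWER_DBM - (40.0 + 35.0 * math.log10(d))

def coverage(site) -> float:
    served = sum(received_power(site, u) >= THRESHOLD_DBM for u in users)
    return served / len(users)

best = max(candidates, key=coverage)
print(f"best site {best}: {coverage(best):.0%} of users covered")
```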
Tractor Maker John Deere Using AI on Assembly Lines to Discover and Fix Hidden Defective Welds
John Deere performs gas metal arc welding at 52 factories around the world where its machines are built, and it has proven difficult to find defects in automated welds using manual inspections, according to the company.
That’s where the successful pilot program between Intel and John Deere has been making a difference, using AI and computer vision from Intel to “see” welding issues and get things back on track to keep John Deere’s pilot assembly line humming along.
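Here is a sketch of how such an inspection hook might sit on a line, assuming a hypothetical classify_weld vision model; this is not Intel’s or John Deere’s actual integration:

```python
from typing import Callable, Iterable, Iterator

def monitor_welds(frames: Iterable, classify_weld: Callable[[object], float],
                  threshold: float = 0.8) -> Iterator[int]:
    """Yield the index of every weld whose defect probability clears the bar.

    classify_weld is a hypothetical vision model returning a defect probability.
    """
    for i, frame in enumerate(frames):
        if classify_weld(frame) >= threshold:
            yield i  # weld to re-inspect or rework before the machine ships

# Toy usage with a stand-in "model" that just echoes precomputed scores.
fake_scores = [0.10, 0.05, 0.92, 0.30]
print(list(monitor_welds(fake_scores, classify_weld=lambda s: s)))  # -> [2]
```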
Surge Demand
Robots are here to help deal with infectious diseases. They are also here to learn about your tacit knowledge of that legacy manufacturing line. Bioprinting is beginning “to put roughly the right cells in roughly the right places”. CNC machines are getting smarter, automatically assessing tool wear and adjusting accordingly.