Smarter Part Production with Simulation and Nesting
Silicon is one of the most common elements on the planet, but it almost always exists as a compound, such as silicon dioxide, the main constituent of common sand. To be useful in the semiconductor industry, silicon must be purified in a process that involves heating silicon tetrachloride to remove impurities and then pulling (or growing) a single silicon crystal.
In the 1950s, that was no easy feat. You couldn’t just order a crystal puller from your friendly wafer-fabrication equipment maker—it all had to be invented and carefully engineered. In those days, there was little distinction between R&D and manufacturing, as early practitioners in the field designed and built the equipment to be put into use on the shop floor. One of the pioneers was Robert E. Lorenzini.
Read more about Mr. Lorenzini and the Secondhand Origins of Silicon Valley’s Ingot Industry at IEEE Spectrum.
Visual Inspection
Understanding Cost and Feasibility of 3D Printing - Ask an Additive Expert
Acoustic Monitoring
Also, Schneider Electric’s Industrial Insights podcast is now live.
Assembly Line
Hyperspectral imaging aids precision farming
Remote sensing techniques have evolved rapidly with the spread of multispectral cameras. Hyperspectral imaging is the capture and processing of an image at a very large number of wavelengths. While multispectral imaging evaluates a scene with three or four colors (red, green, blue, and near infrared), hyperspectral imaging splits the image into tens or hundreds of colors. Using spectroscopy, which identifies materials by how light behaves when it strikes them, hyperspectral imaging obtains a full spectrum of data for each pixel in the image of a scene.
Unlike radiography, hyperspectral imaging is a non-destructive, non-contact technology that can be used without damaging the object being analyzed. For example, a drone with a hyperspectral camera can detect plant diseases, weeds, soil erosion problems, and can also estimate crop yields.
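To make "a spectrum per pixel" concrete, here is a minimal sketch in Python. The cube dimensions, wavelength range, band choices, and NDVI threshold are all illustrative assumptions, and the random array merely stands in for real sensor data.

```python
import numpy as np

# Hypothetical hyperspectral cube: height x width x bands.
# Assume 200 bands spanning 400-1000 nm, evenly spaced.
H, W, BANDS = 64, 64, 200
wavelengths = np.linspace(400, 1000, BANDS)            # nm
cube = np.random.rand(H, W, BANDS).astype(np.float32)  # stand-in for sensor data

def band_index(target_nm: float) -> int:
    """Index of the band closest to a target wavelength."""
    return int(np.argmin(np.abs(wavelengths - target_nm)))

# Each pixel carries a full spectrum, not just 3-4 channels.
pixel_spectrum = cube[32, 32, :]   # shape: (200,)

# NDVI: a classic vegetation-health index contrasting near-infrared
# (healthy leaves reflect strongly) with red (chlorophyll absorbs strongly).
red = cube[:, :, band_index(670.0)]
nir = cube[:, :, band_index(800.0)]
ndvi = (nir - red) / (nir + red + 1e-8)  # epsilon avoids divide-by-zero

# Flag pixels that look like stressed or absent vegetation (threshold is illustrative).
stressed = ndvi < 0.3
print(f"{stressed.mean():.1%} of pixels below NDVI threshold")
```

In practice, per-pixel spectra feed much richer classifiers than a single index, but NDVI shows why even two well-chosen bands can separate healthy crops from stressed ones.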
Influence estimation for generative adversarial networks
As the applications of generative adversarial networks (GANs) expand [1, 2], improving the generative performance of models becomes increasingly crucial. An effective approach to improving machine learning models is to identify training instances that “harm” the model’s performance. Recent studies [3, 4] replaced traditional manual screening of a dataset with “influence estimation,” which evaluates the harmfulness of a training instance by how the model’s performance is expected to change when the instance is removed from the dataset. An example of a harmful instance is a mislabeled one (e.g., a “dog” image labeled as a “cat”). Influence estimation judges this “cat-labeled dog image” a harmful instance when its removal is predicted to improve performance (Figure 1).
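The cited studies derive efficient influence estimators rather than retraining from scratch, and they target GANs; the sketch below only illustrates the underlying idea, using brute-force leave-one-out retraining of a small classifier on toy data with one deliberately flipped label.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

rng = np.random.default_rng(0)

# Toy binary classification data; one training label is flipped
# to play the role of the "cat-labeled dog image".
X_train = rng.normal(size=(100, 2)) + np.repeat([[0, 0], [2, 2]], 50, axis=0)
y_train = np.repeat([0, 1], 50)
y_train[0] = 1  # deliberately mislabeled instance

X_val = rng.normal(size=(200, 2)) + np.repeat([[0, 0], [2, 2]], 100, axis=0)
y_val = np.repeat([0, 1], 100)

def val_loss(X, y):
    """Validation loss of a model trained on (X, y)."""
    model = LogisticRegression().fit(X, y)
    return log_loss(y_val, model.predict_proba(X_val))

baseline = val_loss(X_train, y_train)

# Influence of instance i = change in validation loss when i is removed.
# Positive influence => removing the instance helps => instance is "harmful".
influences = []
for i in range(len(X_train)):
    mask = np.arange(len(X_train)) != i
    influences.append(baseline - val_loss(X_train[mask], y_train[mask]))

worst = int(np.argmax(influences))
print(f"Most harmful instance: #{worst} (influence {influences[worst]:.4f})")
```

Because a positive score means the validation loss drops when the instance is removed, the mislabeled point should surface at or near the top of the ranking; the papers' contribution is estimating this quantity without the prohibitive cost of retraining once per instance.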
Innovation Fuels Stanley Black & Decker's Transformation
With more than 100 manufacturing plants globally, the 178-year-old Stanley Black & Decker (SBD) has entrenched itself as one of the world’s most recognizable and innovative brands.
A key component of the company’s staying power? A clear journey of continuous improvement and a dedication to innovation, including regularly applying advanced technologies across the company’s operations. The result is a culture dedicated to seeking “game-changing solutions,” one that consistently yields an impressive number of new products and world firsts each year.
Complex machine validations performed with multiphysics simulation
Applying new materials and methods to manufacturing increases product complexity, but the benefits can be significant: products are now lighter, smaller, and more easily customizable to meet consumer demands. Multiphysics simulations enable machine builders to virtually explore the physical interactions that complex products encounter, tracking data on product performance, safety, and longevity.
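As a toy illustration of coupling two physics, the sketch below feeds a 1D heat-conduction solve into a thermal-stress calculation. The material values are textbook numbers for steel and the fully constrained rod is an assumption, so this is a conceptual sketch, not what a commercial multiphysics package does.

```python
import numpy as np

# Minimal sketch of one-way coupled multiphysics: a transient heat
# equation on a 1D steel rod feeds a thermal-stress calculation.
L_ROD, N = 1.0, 50                  # rod length (m), grid points
alpha_d = 1.2e-5                    # thermal diffusivity (m^2/s), assumed
E, alpha_t = 200e9, 1.2e-5          # Young's modulus (Pa), expansion (1/K), assumed

dx = L_ROD / (N - 1)
dt = 0.4 * dx**2 / alpha_d          # stable explicit time step
T = np.full(N, 20.0)                # initial temperature (C)

# Physics 1: explicit finite-difference heat conduction,
# left end held at 220 C, right end at 20 C.
for _ in range(20000):
    T[0], T[-1] = 220.0, 20.0
    T[1:-1] += alpha_d * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

# Physics 2: if the rod is fully constrained, the temperature field
# produces thermal stress sigma = E * alpha * dT at each point.
sigma = E * alpha_t * (T - 20.0)
print(f"Peak thermal stress: {sigma.max() / 1e6:.0f} MPa")
```

Real multiphysics tools solve such couplings, including two-way ones, over full 3D geometries, but the structure is the same: one field’s output becomes another field’s input.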
FPGA comes back into its own as edge computing and AI catch fire
The niche of edge computing burdens devices with the need for extremely low-power operation, tight form factors, agility in the face of changing data sets, and the ability to evolve with changing AI capabilities via remote upgradeability, all at a reasonable price point. This is, in fact, the natural domain of the FPGA, with its inherent excellence at accelerating compute-intensive tasks on a flexible, hardware-customizable platform. However, most available off-the-shelf FPGAs are geared toward data center applications, where power and cost profiles justify the bloat in FPGA technologies.
Surge Demand
NASA pushes the boundaries of additive manufacturing with rocket engine hardware. Microsoft is re-inventing railways in Europe with cloud computing.