With the addition of computers, laser cutters have quickly become relatively simple and powerful tools, with software controlling machinery that can cut metal, wood, paper, and plastic. Although this mix of materials feels all-encompassing, users still face the difficulty of telling visually similar materials in their stock apart. Cutting the wrong material can produce a gooey mess, give off a terrible smell, or worse, spew harmful chemicals.
For distinctions that may not be clear to the naked eye, scientists from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have proposed “SensiCut,” a smart material-sensing platform for laser cutters. In contrast to conventional camera-based approaches, which can easily misidentify materials, SensiCut identifies them through a subtler combination of deep learning and an optical technique called “speckle sensing,” which uses a laser to probe the microstructure of a surface and requires only an image-sensor add-on.
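The article does not include SensiCut’s code, but the core idea can be sketched. Below is a minimal, illustrative speckle-image classifier in PyTorch; the backbone (a ResNet-18 fine-tuned from ImageNet weights) and every other detail are assumptions for illustration, not the team’s actual implementation.

```python
import torch
import torch.nn as nn
import torchvision.models as models

NUM_MATERIALS = 30  # number of material types reported in the article

# Assumption: a standard ImageNet backbone fine-tuned on speckle images;
# the paper's exact architecture is not specified here.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_MATERIALS)

def classify_speckle(image: torch.Tensor) -> int:
    """Predict the material index for one (3, H, W) speckle-image tensor."""
    model.eval()
    with torch.no_grad():
        logits = model(image.unsqueeze(0))  # add a batch dimension
    return int(logits.argmax(dim=1).item())
```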
A little help from SensiCut can go a long way: it could potentially protect users from hazardous waste, provide material-specific knowledge, suggest subtle cutting adjustments for better results, and even engrave items that consist of multiple materials, such as garments or phone cases.
“By augmenting standard laser cutters with lensless image sensors, we can easily identify visually similar materials commonly found in workshops and reduce overall waste,” said Mustafa Doga Dogan, a CSAIL PhD candidate at MIT. “We do this by leveraging a material’s micron-scale surface structure, which is a unique characteristic even when it is visually similar to another type. Without that, you would likely have to make an educated guess at the correct material name from a large database.”
Beyond cameras, sticker tags (such as QR codes) have also been used on individual sheets to identify them. That seems simple enough, but if the code is cut off from the main sheet during laser cutting, the remaining material can no longer be identified for future use. And if an incorrect tag is attached, the laser cutter will assume the wrong material type.
To successfully play a round of “what material is this,” the team trained SensiCut’s deep neural network on more than 38,000 images of 30 different material types, allowing it to distinguish between things like acrylic, foamboard, and styrene, and even provide further guidance on power and speed settings.
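A training step consistent with that description might look like the sketch below, again with assumed details: the framework, the dataset’s folder layout, and every hyperparameter are placeholders rather than the team’s actual setup.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Speckle images are effectively monochrome; replicate to 3 channels
# so they fit a standard pretrained backbone.
transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Assumption: images stored as speckle_dataset/<material_name>/*.png,
# one folder per material type (30 folders, ~38,000 images in total).
train_data = datasets.ImageFolder("speckle_dataset", transform=transform)
loader = DataLoader(train_data, batch_size=64, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # epoch count is an arbitrary placeholder
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```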
In one experiment, the team decided to build a face shield, which required distinguishing between the transparent materials in the workshop. The user first selects a design file in the interface and then uses the “pinpoint” function to move the laser and identify the material type at a chosen point on the sheet. The laser interacts with the tiny features of the surface, and the reflected light arrives at the pixels of the image sensor, producing a unique two-dimensional speckle image. The system can then warn or flag the user that their sheet is polycarbonate, which could produce highly toxic flames if cut by a laser.
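As a rough illustration of that pinpoint-then-warn workflow, the sketch below wires the pieces together. The names `move_laser_to`, `capture_speckle_image`, and the material list are hypothetical stand-ins rather than SensiCut’s real interface, and `classify_speckle` refers to the classifier sketched earlier.

```python
# Excerpt of class names; a real list would match the 30 training labels.
CLASS_NAMES = ["acrylic", "foamboard", "polycarbonate", "styrene"]
HAZARDOUS = {"polycarbonate"}  # flagged because laser-cutting it is unsafe

def move_laser_to(x_mm: float, y_mm: float) -> None:
    """Hypothetical stub: command the cutter head to sheet coordinates."""

def capture_speckle_image():
    """Hypothetical stub: read one frame from the lensless image sensor."""

def check_material_before_cut(x_mm: float, y_mm: float) -> str:
    move_laser_to(x_mm, y_mm)        # "pinpoint" the spot to probe
    image = capture_speckle_image()  # unique 2D speckle pattern
    material = CLASS_NAMES[classify_speckle(image)]  # classifier sketched above
    if material in HAZARDOUS:
        raise RuntimeError(
            f"Sheet identified as {material}; cutting it may produce toxic flames."
        )
    return material
```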
SensiCut’s speckle imaging runs inside the laser cutter on low-cost, off-the-shelf components, such as a Raspberry Pi Zero microprocessor board. To keep it compact, the team designed and 3D-printed a lightweight mechanical housing.
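For a sense of how little hardware-side code such a setup needs, here is a minimal capture sketch. It assumes a Raspberry Pi camera module serves as the image sensor, which is an assumption: the article only names the Raspberry Pi Zero board, not the specific sensor.

```python
from time import sleep
from picamera import PiCamera  # standard Raspberry Pi camera library

camera = PiCamera()
camera.resolution = (640, 480)   # modest resolution keeps the Pi Zero responsive
sleep(2)                         # let auto-exposure and gain settle
camera.capture("speckle.jpg")    # one raw speckle frame for classification
camera.close()
```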
Beyond laser cutters, the team also envisions a future in which SensiCut’s sensing technology is integrated into other manufacturing tools, such as 3D printers. To capture further nuances, they also plan to extend the system with thickness detection, a relevant variable in material composition.
Dogan co-authored the paper with undergraduate researchers Steven Acevedo Colon and Varnika Sinha of MIT’s Department of Electrical Engineering and Computer Science, Kaan Akşit, an associate professor at University College London, and Stefanie Mueller, a professor at MIT.
The team will present their work at the ACM Symposium on User Interface Software and Technology (UIST) in October. This work was supported by NSF Award 1716413, the MIT Portugal Program, and the MIT MathWorks Seed Funding Program for Mechanical Engineering.


Post time: Oct-25-2021