Executive Briefing: AI Impact on Industrial Workplace Safety

Feb. 19, 2021

The power of computer vision is poised to have a positive impact on the $3 trillion workers’ compensation space. AI models can help prevent 85% of injuries in the workplace – slips and falls, manual materials handling, tool accidents and struck-bys. In this executive briefing, ADLINK, StrongArm Technologies and Chooch AI will discuss how the integration of their three technologies enables enterprises to detect missing safety gear and dangerous conditions and alert workers and management in real time. Hear from leaders not only about the impact on costs and worker safety, but also about the practical solution available now for the workplace.


Jeffrey Goldsmith:

I’d like to welcome everyone to this executive briefing around workplace safety and AI. On the webinar, we have Michael Liou from Chooch AI, Sean Petterson from StrongArm Technologies, and Keith Robinson from ADLINK.

Michael Liou:

All right. Terrific. Thank you, Jeffrey. It’s great to be here with my colleagues, Keith and Sean. What we’d like to do in the next 30 minutes is talk about industrial safety, or EHS – environmental health and safety. So I’m going to start off here by sharing my screen. Chooch AI, for those who don’t know us, is a visual artificial intelligence company based in Silicon Valley. We’ve developed cutting-edge technology that detects objects, images, faces, actions, and even temporal conditions. We can generate predictions both in the cloud and at the edge. And we’re a horizontal AI platform, meaning that we’re somewhat industry agnostic. To date, we’ve developed solutions in areas like industrial safety, which we’re discussing today, infrastructure monitoring, safety and security, and healthcare. We’ve done some work in geospatial as well as in the retail space.

Michael Liou:

Now, the power behind the platform is derived from integrating dozens and dozens of steps across three key disciplines in the visual AI space. One is dataset generation, where the team has built automated tools such as machine labeling, video annotation, and even data augmentation to take data and rapidly annotate it; these tools help our annotators generate over 20,000 images per day. The second discipline is the meat of it, model training and development. The team has built an internal no-code platform that will automatically select a deep learning framework or frameworks, the epochs, layers, and thresholds, and do validation and testing, and can often generate base models from these datasets in a day. So not weeks, not months – they’re very, very rapid iterations. And then lastly, cloud and edge inferencing, where we’re able to generate fast and accurate predictions, and we can run multiple models simultaneously.

Michael Liou:

We can generate calls to action as well as share that data. We’ve also deployed out to the edge via video cameras and, obviously, GPUs. Edge AI is very, very topical right now – we’re seeing a tremendous amount of demand from our partners: OEMs, system integrators, chip manufacturers, software developers. And obviously, with the advent of the lightweight, high-compute-power devices developed by Nvidia in their line of GPUs, we’re able to put these very, very small devices on premise, co-located with the cameras, essentially turning dumb cameras into smart nodes. We also have the ability to do what we call very fast edge inferencing, meaning that we’re able to generate predictions often under 20 milliseconds upon detection. As many of you know, edge confers a lot of advantages compared to cloud computing, which obviously is very, very scalable.

Michael Liou:

One is that if you need to stream 24/7 mission-critical video to the cloud, that can get very, very expensive. If you’re sharing the occasional 10-second clip or image, no big deal, but in critical situations where you’re trying to detect fire, or you’re trying to ensure that people are wearing the right safety gear to prevent accidents, this is quite critical. We also need edge computing in environments where the network might not be very reliable or the throughput or bandwidth may not be that high. In some instances, like oil fields where there is no cell service or wifi, we’re in comms-denied environments, and therefore we have this need for additional edge computing. And then lastly, when you do move models from the cloud to smaller devices, a degree of model optimization also has to occur, especially if you’re running multiple models.
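To make the streaming-cost point concrete, here is a rough back-of-envelope sketch. The bitrate, camera count, and per-gigabyte price below are illustrative assumptions, not figures from the speakers.

```python
# Back-of-envelope: data volume of streaming continuous video to the cloud.
# All inputs are illustrative assumptions, not vendor figures.
BITRATE_MBPS = 4          # assumed H.264 stream per camera
CAMERAS = 20              # assumed cameras on one site
SECONDS_PER_MONTH = 30 * 24 * 3600
COST_PER_GB = 0.05        # assumed cloud ingest/storage cost, USD

gb_per_month = BITRATE_MBPS / 8 * SECONDS_PER_MONTH * CAMERAS / 1024  # MB -> GB
print(f"~{gb_per_month:,.0f} GB/month streamed")
print(f"~${gb_per_month * COST_PER_GB:,.0f}/month at ${COST_PER_GB}/GB")
# Sending only short event clips, or running inference at the edge,
# avoids most of this volume.
```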

Michael Liou:

And the team here has created an optimization technique that allows us to do this – to deploy multiple models on edge devices without sacrificing accuracy and without sacrificing speed. By way of example, on the AGX GPU that Nvidia makes, we’re able to handle about five models per stream and about five streams per device. All of this can be handled and managed on the Chooch AI dashboard. It helps provision and allocate resources: you can manage all your datasets and all your models, either customized or pre-trained. We have the ability to register all the GPUs on site or near-edge devices, register cameras to each of those GPUs, and then select multiple models, all deployed from the same dashboard. Once inferences are generated, we can generate a call to action via text or email, and we can also post that metadata – either we save it, or the client saves it in their cloud or on premise for additional analysis. And then lastly, we have the ability to update these edge models automatically.
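As a rough illustration of the provision-and-alert flow described here – register a GPU, attach cameras, deploy models, then fan alerts out by text or email – here is a minimal sketch. The endpoint paths, payload fields, and device names are hypothetical placeholders, not the actual Chooch AI dashboard API.

```python
import requests  # assumes a generic REST-style management API, not the real Chooch endpoints

BASE = "https://dashboard.example.com/api"   # hypothetical URL
HEADERS = {"Authorization": "Bearer <api-key>"}

def provision_edge_site():
    # Register an on-site GPU device (e.g. an Nvidia AGX-class box).
    gpu = requests.post(f"{BASE}/devices", json={"name": "warehouse-agx-01"},
                        headers=HEADERS).json()
    # Attach cameras to that device and pick the models each stream should run.
    for cam in ["dock-door-1", "aisle-4", "loading-bay"]:
        requests.post(f"{BASE}/devices/{gpu['id']}/cameras",
                      json={"rtsp_name": cam,
                            "models": ["ppe-detection", "fire-smoke", "fall-detection"]},
                      headers=HEADERS)

def handle_inference(event: dict):
    # Called when the edge device reports a detection; fan out a call to action.
    if event.get("class_title") in {"no_hard_hat", "smoke", "person_down"}:
        requests.post(f"{BASE}/alerts",
                      json={"channel": ["sms", "email"], "event": event},
                      headers=HEADERS)
    # The event metadata can also be persisted for later analysis,
    # in the client's cloud or on premise.
```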

Michael Liou:

So why don’t we move to safety? It goes without saying that, unfortunately, there are still way too many accidents happening out there. In 2018, about two and a half million people sustained workplace injuries and had to go to the ER. Almost a quarter million people had to miss work due to injuries just from contact with objects and equipment. And the most common workplace injuries that required ER treatment were being struck by an object or equipment, overexertion – like pulling your back out, as an example – and slips and falls.

Michael Liou:

And there are other things that we should be doing better in this field. There were a lot of different standards violated back in 2019. What if we actually used AI to ensure that those standards were adhered to, thereby reducing risk, reducing injury, reducing lawsuits, and maintaining productivity? All of this actually has an impact on the bottom line, which my colleague Sean Petterson will get to in a minute.

Michael Liou:

So we have built a number of different pre-trained models here at Chooch AI. One is an industrial PPE package, with models that detect hard hats, safety vests, gloves, and goggles. Obviously, if we can ensure that people are complying and wearing the appropriate safety gear, we can certainly lower the risk of getting struck by a moving object and reduce head injuries as well as potential eye injuries. We can also build models to monitor ladders and scaffolding, make sure you’re wearing the correct safety harness while you’re in an elevated position, warn people if they’re working under suspended loads, and even ensure that people who are working are not using their mobile phones. We’ve also developed models for fire detection, smoke detection, as well as fall detection.
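To illustrate the kind of rule that can sit on top of a PPE detection model like this, here is a small sketch that compares detected classes against the gear required for a zone. The class names and zone policy are made up for the example.

```python
# Hypothetical post-processing of PPE detections; class names and zones are illustrative.
REQUIRED_PPE = {
    "loading_dock": {"hard_hat", "safety_vest"},
    "grinding_station": {"hard_hat", "safety_vest", "goggles", "gloves"},
}

def missing_ppe(zone: str, detected_classes: set[str]) -> set[str]:
    """Return the PPE items required in `zone` that were not detected on the worker."""
    return REQUIRED_PPE.get(zone, set()) - detected_classes

# Example: a detection model reports what it sees on one person.
seen = {"hard_hat", "gloves"}
gaps = missing_ppe("grinding_station", seen)
if gaps:
    print(f"Alert: worker at grinding_station missing {', '.join(sorted(gaps))}")
```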

Michael Liou:

These can be applied to many different public areas, such as malls, retail, warehouses, and nursing homes. Interestingly, the current state of technology that detects fires and smoke requires the smoke and fire to actually get big enough to set off the sensor up in the ceiling, and in a warehouse where you have 30- or 40-foot ceilings, there’s going to be a pretty big fire before the sensors go off. So envision having cameras in the warehouse detecting these things when they’re actually quite small, thereby alerting management, alerting 911, and actually extinguishing the fire sooner rather than later.

Michael Liou:

For slips and falls, often these don’t amount to a big deal, but what happens if you’re unconscious? What happens if you call for help, but nobody’s around because you’re working the late shift – or perhaps there are other people around you, but they can’t hear you because they’re all wearing industrial ear plugs? These are some of the use cases where visual AI can actually help and notify management that someone’s been injured – again, sooner rather than later. We’re part of a larger ecosystem here. These are just some of our great partners in the Nvidia visual AI space, and that obviously includes ADLINK and StrongArm. And with that, I would like to hand it off to the founder and CEO of StrongArm Technologies, Sean Petterson. Thank you.

Sean Petterson:

Thanks, Michael. Everyone, it is absolutely a pleasure to be here. My name is Sean Petterson. I am the founder and CEO of StrongArm Tech, and we’re an industrial safety science company. The reason for our business is to protect the blue-collar workers that we’re talking about. We like to call them industrial athletes. They’re the individuals who are working inside the warehouses, stocking the shelves at your grocery store, building the cars on our roads, paving the roads, and really keeping us afloat. What we do is keep them proud, protected, and productive, and we help them avoid some of those injuries we just discussed. $3 trillion a year spent – and that’s just the start of the challenges. We’re dealing with employee attrition; some of our clients have 200% turnover rates, and hiring in this environment is incredibly challenging.

Sean Petterson:

In addition, there’s the new normal of COVID – the load that’s put on EHS professionals is just one more thing to balance. And it’s our goal to use technology to help avoid these injuries, because these injuries happen fast. You may be surprised, but during this chat alone, 38,000 people are going to get hurt or die just by doing their jobs. So they happen fast; we need to be faster. And this combination of our wearable technology with machine vision enables us to get to that point – to the point where we’re addressing the challenge, which is the lack of information from the most sensitive, integral part of Industry 4.0. Right now, everything in the warehouse is connected except for the human being. So what we find is that by embedding insights based on safety, based on the health and wellbeing of that individual, we’re now able to find what we see as the golden triangle: an inextricable link between safety, productivity, and efficiency. And we help you get to that optimized workforce using our platform.

Sean Petterson:

Our platform is relatively straightforward. We provide wearables that are prescribed for the type of job that individual is working. We deliver insights and haptic feedback in alerts, and we eliminate the injury right at its source. We then take that data and provide it to the managers, the EHS professionals, who use that data to come up with a continuous improvement plan – not just managing by watching people and coaching them in real time, but using actual objective data to go down and have meaningful coaching moments. We then take that data a step further and drive it into operations. What we’re doing there is unifying everyone on the same front of safety – aligning incentives for operations, industrial athletes, and the business overall for productive use of safety score information. The type of information that we’re collecting depends on the environments we sit in.

Sean Petterson:

But some of the most common factors that we’re finding, first and foremost, are going to be ergonomic challenges. So we’re going to measure all of the external movements, provide haptic feedback, and eliminate ergonomic challenges. We’re also going to provide proximity information. This is great for forklift avoidance and lockout/tagout zones, and now we’re providing insights for social distancing and contact tracing. We’re able to do this by understanding where everyone is in proximity to each other and where you are in proximity to risk. We’re also able to drive our sensors – you can almost think of them as nodes walking around this warehouse – collecting environmental information and beaconing. What we’re able to do is essentially provide a spectrum of risk and where that individual is in proximity to risk. These are some of the wearables that we provide.

Sean Petterson:

The one on the left is calculating heat, humidity, air quality, barometric pressure, height – all the types of insights for high-risk environments. And the one on the right is optimized for comfort, focusing primarily on ergonomics, social distancing, and contact tracing – and of course proximity detection for forklifts, and it can also be used as a beacon. This has worked across the globe. We’ve had 35,000 individuals on the platform who’ve experienced a 45% reduction in injuries. That’s 45% fewer people going home hurt, 45% fewer people taking opioids. And we’re driving true ROI: our clients will see a 250% ROI in the first year, and that’s just from injury savings from the use of the platform and from driving data into the operations based on the wellbeing of these individuals. As we look further for new ways to apply our insights here: if you can measure it, you can manage it.

Sean Petterson:

The next logical step for us is improving the insurance value chain, and we do that with data. We are bringing insurers closer to the risk spectrum. We’re giving them insights on risk selection, providing high levels of granularity around job type, all the way down to the SIC code. We’re also helping price that risk based on how much risk is actually happening within the timeframe of that person’s job – we know, down to a twelfth of a second, what their risk profile looks like – which really helps us out with claims. We’re giving insight forensically: we know what happened before, during, and after that incident, allowing us to drive better alignment, most specifically alignment through risk engineering and risk management, providing real ROI to make decisions that not only reduce the risk profile but increase the operational profile of the business itself.

Sean Petterson:

So here’s where this has worked for us, and some examples of taking safety data beyond just injury reduction and into operational change. In this particular case, we were in a very high-paced warehouse operation, where we found our platform performed very well – just the basics of the platform reduced injuries in this warehouse by 50% year over year. But we found there was one location where we could get a little bit tighter, one area where this job function was not quite improving. Our data quickly showed us that sagittal twisting velocities, one of the ergonomic factors we collect, were just not improving. Training or some other form of intervention wasn’t going to change this; it needed to be infrastructural. So we recommended a conveyor belt with a 45-degree bend in it that essentially eliminated the ergonomic challenges and eliminated the injuries from there.

Sean Petterson:

Not only did it do that, but it also increased the throughput of the individuals on that line. We increased the throughput so much that we actually eliminated an entire day of overage hours – they were paying time and a half for that six-day workweek, and we brought them back down to five days. The system works because it keeps people aligned. It allows operations to make ROI-driven decisions based on safety and savings, and it creates that win-win we were discussing, with data. In another operation – this is one of our power users – they took our data and actually informed operations and workflow with it.

Sean Petterson:

We were able to assess, down to the second, what individuals’ work profiles were and change their shifts midday. By dynamically changing their shift structure based on their physiology and their risk, we not only saw a severe reduction in injuries, but we actually saw an increase in productivity and throughput. So what I’d like to leave you with today is, again, expanding the idea of what we can measure and what we can assess, and taking these insights beyond safety and into operations. The more we can measure, the more we can manage and improve the overall risk profile of a business. Over to you, Keith.

Keith Robinson:

Thank you very much for that, Sean. What both Michael and Sean have said is that the common factor here is gathering this data at the edge – and how do we do that? So first of all, I’d like to introduce ADLINK a little bit, in case anybody’s not aware of us. We actually celebrated 25 years of designing and developing embedded-to-edge computing technology last year. Our solutions are pervasive across everything from the moment a person gets up in the morning to when they go back to bed at night. We have systems involved in transportation – monitoring transportation, autonomous vehicles – all the way through into smart city applications for monitoring and evaluation. Once people arrive at the factory, we have systems and solutions involved in smart factories and robotics, all the way through into machine health and, obviously, worker safety.

Keith Robinson:

We even touch the healthcare environment, where our systems are used in surgeries and other areas to help the emergency health services provide a better service. You can’t even make a phone call, actually, without some of our 5G technology being involved in the network. And once you get to the point of wanting to relax, we have systems involved in infotainment as well. So truly, our hardware and software sit across everything in a working day.

Keith Robinson:

To achieve this, we’re very much a technology-led organization. We have about 1,900 employees, and 30% of them are actually in R&D, developing new products and new solutions, which requires us to work with a number of major partners such as Nvidia and Intel – and obviously companies like Chooch – to supply the end solutions. Part of the business around the edge is ensuring that we have the right open systems to enable all of these different technologies to talk to each other. So again, we’re very heavily involved in a lot of open systems technologies, such as ROS, the Robot Operating System, where we’re not just members but are also providing technology and specification information.

Keith Robinson:

ADLINK, in terms of our solution sets, enables smarter applications. To Michael’s points earlier on, you need more computing power at the edge: transmitting large amounts of video data back up into the cloud has significant cost impacts; these systems are mission critical, so they can’t always rely on connectivity; and, as Michael quite rightly said, we can have them on all platforms and in different areas. So there’s an entire set of requirements around having that compute power based at the edge, at the actual place of operation. We have a range of hardware that enables us to support that – to enable gateways and to provide the base hardware that lets those software solutions be carried out. Across that we have a range of software platforms, and today we’re talking specifically around VisionLink, but we have other ranges of software which enable us to control robotics and other edge platform technologies.

Keith Robinson:

One of the things today, as Sean was mentioning, is that gathering more data and more information at the edge is becoming more prevalent. This requires more and more connectivity to different sensors, different touch points, and vision solutions – and even, in the case of some of our examples, the ability to talk to different systems to shut them down or highlight a risk of somebody working in a particular area. This requires us to be adaptable, and for the solutions to be configurable. So the way we work our systems is using a piece of edge software that we call the ADLINK Data River, which enables us to match and connect not just the sensors, but also the different interfacing pieces of software.

Keith Robinson:

One of the things alongside this, as Michael said, is that they are building the inference and training pipeline for these different solutions. Part of this software also enables them to modify and change – basically scale up and down, vertically and horizontally – without having to go back and redevelop. This gives you the flexibility, and it enables us to provide the datasets for Sean and his team to look at and analyze.

Keith Robinson:

This also requires a suite of hardware. We have a range of embedded boards and other technologies that allow us to connect to virtually any sensor. We have a range of technologies that enable us to do vision capture. And importantly, we have a range of high-powered units that give us the compute power at the edge. It’s not just about having the compute power: in the case of oil and gas and other areas, these technologies also have to be extremely industrialized, able to work in environments from minus 40 to plus 70 degrees and in high-vibration areas as well. So there’s a practicality aspect as well – making sure that wherever this compute power is needed, we have the capability for those systems to work reliably and continuously.

Keith Robinson:

I mentioned before that, for us, it’s extremely important that we work with a number of other organizations – like Jeff’s and Michael’s – and with a number of open systems suppliers and other software suppliers. That enables us to talk to these multiple different systems, provide the information that comes back in so the analytics can be configured and returned to the user, and, as Sean quite rightly said, enable them to make decisions that help ensure people don’t get injured in the first place.

Keith Robinson:

As I say, we are pervasive across a number of different areas. We’re involved in solutions for everything from counting chocolate chip cookies as they come off the end of a production line – ensuring they go into the right box, onto the right pallet, and out to the right person to be delivered – all the way through to railway systems, a whole multitude of areas, and even battlefield scenarios. It’s always interesting to me that the same technology is being used to check chocolate chip cookies and in a battlefield. However, in terms of the worker safety environment, and this is a very important part for us, it’s not just about worker safety: you can generate efficiencies, as Sean mentioned, and improve the actual operation, which in turn also increases the safety of the workers.

Keith Robinson:

So I just wanted to give a couple of examples of where we’ve put systems in place. One of the major areas, as we all know, is about increasing security and efficiency. It’s not just about, as we’ve got here, making sure biometrically that a person should be going through a gate, for maybe an airport or a railway station. If you look at a warehouse, part of the solution in terms of increasing worker safety is making sure that a person is not in an area they’re not supposed to be. Using this technology, rather than just waiting for that person to have an accident, the system can be proactive in notifying and letting people know that they’re in the wrong location.

Keith Robinson:

Railways are becoming more and more of an important area. They have to be safer, they have to be smarter, and they have to be more reliable. So again, we have application systems where we’re looking at how people board and leave trains, ensuring that’s done in a safe manner and ensuring the trains do not move – but also using the technology to monitor how the actual railway operates, making sure that everything is in place, correct, and ready to operate.

Keith Robinson:

It’s not just about people. When we’re tracking and when we’re looking at things, it’s not always about whether a person is in the wrong place, going in the wrong direction, or wearing a hard hat. Other areas are about whether actual objects are in the wrong place. An example of this is where we are using artificial intelligence and machine vision to monitor what they call FOD – foreign object debris – on runways. This is very important. Normally it’s carried out by somebody driving up and down in a lorry at different times. We can actually monitor that and then pass on the information to say that debris is on the runway. As I’m sure you can imagine, the risk to an aircraft taking off with debris on the runway is very high, and it’s a very, very critical situation.

Keith Robinson:

So again, not only are we making the system more efficient, but we’re also making safety higher and improving on the current situation. And I think this is the important thing: when we look at this edge technology, it’s not just about how we can provide more compute power or how efficient it is – 90% of what you’re looking at is also generating a safer working environment for the staff, to Sean’s comment about the number of injuries and accidents that go on – and being preemptive in terms of how that operates. So monitoring to say that, within a warehouse, for example, in the last four days five people walked within a foot of a forklift truck – it’s much better to know that and change the process to stop it happening in the first place. Typically what happens now is you have to wait until the accident happens and then it’s reviewed, but that’s a little bit late in that situation.

Keith Robinson:

And it’s exactly the same in the case of the runway example: if you wait until after an accident with an aircraft, you’ve got a major disaster on your hands. So it’s about being preemptive and starting to pull back that information and data. From ADLINK’s point of view, what we are here to do is to provide that compute power, provide that technology, and provide that Data River software capability so that the data can be aggregated and sent back for the analytics and the information. I hope that’s given you all a few ideas, and hopefully opened up quite a few thoughts and questions about some of the areas where this technology can be used. At this point, I’ll hand back to Jeffrey. Thank you.

Jeffrey Goldsmith:

Great presentation, Keith. Thanks, Sean. Thanks, Michael. So I’m going to look through these questions. The first one I think we can ask is… My chat window just disappeared for a moment, let me go back to it. I’m going to post this in chat so you can all see it. The question is: collectively, how do you see these technologies working together in one workplace environment, and what’s a real-world example of how that can work? Any of you can start off and then pass it on to the next person. How do we see these technologies working together? And if you have real-world examples to make it more real for people, that’d be great.

Michael Liou:

Yeah, maybe I can kick that off, and I’ll speak a little bit on behalf of Sean, since we are collaborating as partners right now. I’ve mentioned how we obviously need industrialized hardware in some of these environments. But as Sean aptly pointed out, he has focused pretty much on sensor technology and has collected millions of man-hours of data ensuring worker safety. We are actually looking to combine both sensor-based and visual artificial intelligence in some of these environments, taking advantage of both worlds in order to ensure that people are complying with the various different standards. So Sean, do you mind touching a little bit more on how we’re thinking about setting up some of these arrangements for some of the more complex situations?

Sean Petterson:

Yeah, absolutely. So the trend in the world right now is that we’re seeing EHS be automated. We’re seeing things that need to be automated, partly because the data needs to be more proactive, but also, with the advent of COVID, it’s challenging to have in-person meetings. So we look for things to start with when we enter our clients’ sites, and then we grow out. The first one is compliance: how can we ensure people are wearing face masks in this new normal, how can we ensure they have hard hats and safety vests on when we’re at a distance? We’re able to provide those alerts based on compliance and deny access to sites without compliance, right at the entrance to the site. We’re able to do that through our smart dock – assess what the individual is wearing and not allow them to check out a sensor.

Sean Petterson:

Once we’ve ensured compliance, we’re out on the shop floor, and now it’s about a dynamic environment: how do we catch all of these injuries that happen? Injuries happen fast – how do we be faster? If you think of the combination of our wearables with our Chooch AI setup, it’s almost like having a bit of a coach, or even a guardian angel in the sky, as we look at it. We see something happening, we send an alert, and we avoid the incident. One good example of that is forklift collision. We’ll have a forklift coming around a blind corner; we’re able to assess that, process the alert, and send it to the individual on the other side. We have redundancy because our sensors also communicate with that forklift and the beacon we have on there.

Sean Petterson:

And we create a full picture of risk. Then, taking that a step further, if an incident does happen, we now have video footage to combine with our on-body sensor data, and we’re coming up with a very quick processing solution. So that narrative drives further value: where cameras can assess, we can now deliver the insight in real time. It’s not going up to the cloud and coming back down to a site where someone needs to send an alert – it’s happening in real time. It’s keeping the person safe, giving them the action and the data when they need it, and recording that data to improve the workflow moving forward.
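A stripped-down sketch of the blind-corner logic described above: a camera or beacon estimates the forklift’s position, the wearable knows the worker’s position, and a haptic alert fires when the projected paths close inside a threshold. The distances, threshold, and time horizon are invented for the example, not StrongArm parameters.

```python
import math

ALERT_RADIUS_M = 3.0   # assumed danger radius around a moving forklift

def distance(a: tuple[float, float], b: tuple[float, float]) -> float:
    return math.hypot(a[0] - b[0], a[1] - b[1])

def check_proximity(worker_xy, forklift_xy, forklift_velocity_xy, horizon_s=2.0):
    """Alert if the forklift is, or shortly will be, within the danger radius."""
    now = distance(worker_xy, forklift_xy)
    future = distance(worker_xy,
                      (forklift_xy[0] + forklift_velocity_xy[0] * horizon_s,
                       forklift_xy[1] + forklift_velocity_xy[1] * horizon_s))
    if min(now, future) < ALERT_RADIUS_M:
        return "haptic_alert"   # buzz the wearable on the far side of the blind corner
    return None

# Worker standing just past a corner; forklift 6 m away, approaching at 2 m/s.
print(check_proximity((0.0, 0.0), (6.0, 0.0), (-2.0, 0.0)))  # -> "haptic_alert"
```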

Jeffrey Goldsmith:

Great.

Keith Robinson:

And from my side, as I mentioned, we can monitor anything, we can tag anything, we can look at anything. However, the important thing for us is: what is the important information? What is critical? So the consulting we do here is about being able to apply that, use the data that’s coming out, pull it together, and go back and say: this is what we should be monitoring, this is what we should be looking at in this particular scenario.

Keith Robinson:

So it’s great having sensors, it’s great having technology. But to Michael’s point when we were talking about fire alarms earlier on: you have to wait for the fire to be there before the fire alarm goes off – your [inaudible 00:31:54] you’re detecting, but it’s too late. So the big thing for us is how we can work with the other organizations to look at that data, look at that setup, and look to see how we monitor the right things at the right times. And I think that’s a good thing for all of us to put together and ultimately save more lives.

Jeffrey Goldsmith:

Okay, great answers, guys. So I’m posting the next question, which I’ve created from a few different questions. Are all of these hardware and software solutions custom designed for each application, or are they a standard offering? So are they pre-trained and pre-packaged, or are they custom? Who wants to go first on that question?

Michael Liou:

I’ll start. Actually, if you had asked this question about seven months ago, everything we were doing was pretty much customized, and life to date I think we’ve built over 4,000 custom models for our clients. But as we spoke to our system integrator partners, our chip partners like Nvidia, and our OEMs, we started seeing certain demand signals – a consistent need for certain types of applications. And we realized that we could actually produce a fair number of models, what we call pre-trained ones, that don’t require a customer’s proprietary data. We’re able to scrape data from the web, both video and imagery, and actually train these models up. Then, when a model is actually deployed in the field, some minor adjustments need to be made for that type of camera, the camera angle, and so on.

Michael Liou:

But as a result, we decided to pursue a parallel path and have rolled out a number of different pre-trained models. So in addition to PPE, we can throw in obviously fire and smoke, which I mentioned earlier, weapons detection, slip and fall, even puddle detection – to prevent people from falling, we can alert people that there’s a puddle there. We can customize models to ensure that people aren’t too close to certain conveyor belts or saws, in order to keep a certain type of distance. We’ve been receiving a fair number of oil rig inquiries lately – probably one of the harsher conditions where we have operated. We’ve actually deployed in some desert-like conditions on oil rigs, detecting smoke, fire, flares, and leaks, as well as doing license plate, vehicle, and face authentication from a security perspective.

Sean Petterson:

Yep. And to put it into perspective, between what we’re able to assess with real-time wearables and what we can add to the picture here with Chooch, we essentially can assess 85% of all industrial risk. That is a lot of data and a lot of information, and it’s all about how you deliver that insight when it needs to be delivered, and in a way that people can interpret. So when we deliver our solutions, we don’t deliver a generic solution for industrial athletes, or even a solution for warehouse workers – it’s specific to the job code. We’ll go down to something as specific as AD18, which is warehouse and grocery, and come up with a specific solution for that group. We’re able to deploy the right model because we know that the person stocking a shelf has a different risk profile, and different things to assess, than a person on a seven-story high-rise.

Sean Petterson:

So on a high-rise, we’re able to provide alerts like: are they wearing a fall safety harness, are they wearing the right type of PPE, are we looking for fires? And then when we go into the warehouse, it’s a different model that’s inferencing and producing the right type of alerts for things like forklift collisions, ergonomic challenges, and slips, trips, and falls. So that’s how we work together: we prescribe the right solution based on the injury profile, which comes down to the job type and the historical data that we have to reference.

Keith Robinson:

In terms of the actual edge interface, we have our solution platform which, as I mentioned before, has the Data River in it, and that is designed to expand and develop. So we have data connectors for connections back to the main software solutions, and that fits in there. As we roll out, we can use that repeatedly for new applications and new environments, even adding in new connectors and new sensors, et cetera. Our inference engine is designed with an SDK in it, so if Michael and the team need to make changes, they can adapt as they go rather than redevelop from scratch. So it’s not a completely custom system, and as we build out, we have more and more applets and connectors to enable us to very quickly use that in different application environments. I think you’re muted.

Jeffrey Goldsmith:

I’ll unmute myself. So I’m putting another question in the chat for the panelists, and this is three questions combined. Essentially, these questions relate to surveillance – video surveillance and invasive observation of people – and how this has been perceived in different parts of the world, whether it’s US-centric or applies globally. So essentially, how is this being received from a privacy standpoint around the world?

Sean Petterson:

Sure, I’ll take that one. This is actually a great reason why we love our partnership with Chooch: our ability to maintain anonymity. Our platform is GDPR compliant, we’re ISO certified, and we’re essentially able to anonymize our data and roll it up to the manager so that the individual is always protected. Machine vision and using cameras has always been a desire of ours, but always a challenge, because it can feel like Big Brother. What is really unique here is that we do not have to be Big Brother. We do not have to tattle on the individual; we don’t need to report them based on something that’s inaccurate. What we’re simply doing is processing something dangerous and sending an alert directly to the individual. That ability to provide extra assessment and extra comfort to that individual, without the punitive action that may be associated with it, is the reason that a hardware combination with machine vision is the win that our clients like.

Michael Liou:

Yeah, I’d just like to add that it’s a question we receive a lot. We have a number of different models, including facial. The deployment of the various different AI models onto your GPUs and cameras is totally customized by the user. So if we don’t have a facial model loaded in there, then we actually can’t identify who you are. We may just detect the PPE equipment, or whether people are working under suspended loads, or the number of people working in an area where there’s a high degree of forklift traffic, as an example. But it’s important to note that that’s a decision that is made by management. One of the issues related to facial is obviously the high degree of bias, and we’re quite sensitive to that. So when the team set out to build its facial model, they actually built their database from over a million faces that were culturally and ethnically diverse.

Michael Liou:

That brings the bias down and the accuracy up to 99.7%. Privacy, of course, is a whole separate topic. I think people do have greater concerns in public spaces where there are security cameras and the potential to recognize folks, but outside of China, the UK is pretty wired up and the United States is actually getting there. So usually when you enter even a public space, you’ll see a sign that says the premises is monitored by video security. What really is changing here is that, if you do decide to implement facial identification, you are potentially replacing the people watching the monitors with software.

Keith Robinson:

I’m living in a country which has more CCTV cameras per capita than anywhere else in the world, so I suppose it’s slightly less of an obstacle for me. I think the other thing as well is that we’re actually carrying out the inference, et cetera, at the edge, as we talked about. So again, we’re not streaming back loads of video data that’s being stored and maintained; all we’re passing back is the contextual information around the data we’re gathering. And I think that’s another thing which alleviates some of the concerns in this area – there aren’t multiple copies of data being sent all over the cloud and other areas. For a number of people, that seems to be something which is important to them: knowing who, what, and when, and actually maintaining and holding those database rights.

Jeffrey Goldsmith:

Okay, that’s great. These are really great answers, and I really appreciate the questions that are coming in. So here’s a couple of other questions that are around the same topic of how these platforms work together. Does the edge AI run on dedicated Chooch hardware, or does it run in different products? Also, how do the StrongArm products communicate – Bluetooth or cellular, et cetera? And where will the data eventually be analyzed or stored, and how will the data be exported? So this is really a question of how these platforms work together, how they communicate, and then where the data goes.

Michael Liou:

I’ll start off, because mine is the shorter answer. We have great OEM relationships with companies like ADLINK and others. We are not in the hardware business, we’re not in the hardware or camera installation business, and we’re not really in OEM system integration – we’re a software company. So we rely on these partners to help us implement a solution together. But essentially we are optimized right now for Nvidia GPUs across their entire line, from the small Jetson line up to servers, and any OEM that currently uses Nvidia GPUs can definitely run our models, both on the edge and near the edge.

Sean Petterson:

And our deployment process is very straightforward. We’ll deploy our smart dock; that is the interface that the end users will see, and it’s the only interface that they’ll have to interact with. Managers will get their safety and information reports through our platform. And we charge a simple SaaS fee. That SaaS fee changes as we develop more models, and as we grow it scales based on your risk profile. What we’d like to do is provide a plan of continuous improvement for our teams. As we look for new areas to improve, we’ll work with partners like ADLINK to help us implement new infrastructure, understand each new incident, train new models, learn about new job types, and grow that plan of continuous improvement over the life of the organization.

Keith Robinson:

Yeah. From our side, as Michael said, the software sits on our technology. We provide platform devices and various other items depending on the requirement. So the actual software is sitting on our hardware at the edge in that particular circumstance, and the software interacts in that vein. So it’s really a modular solution that we’re providing here in terms of data connectivity.

Jeffrey Goldsmith:

Great. So here’s a question related… it’s sort of a business question. What are the biggest obstacles to making a sale, and what are the top three reasons the solution is superior to the competition? Who is the competition for this? So, what are the obstacles, who are we competing with, and what are the compelling reasons to go with StrongArm, ADLINK, and Chooch?

Sean Petterson:

That’s great. The biggest obstacle for us in the sale is finding the injury profile to start with, and the challenge to start with, and then helping that client see the pathway. In the wearables industry, frankly, we don’t have a lot of competition that’s combining both of these functions, but we do have a lot of competition in very specific wearables for specific types of injuries you’re trying to solve for in specific industries. So in construction, you’re working on things like vibration, then you’ll work on ergonomics, then fall safety, then building inspection. And the IoT groups that are in charge of those are very disparate, and you find it’s very challenging to bring all those things into one pane. What we’re hoping to do is provide a single pane of glass with one dataset that can unify all fronts and build off of that.

Sean Petterson:

Our safety score is based on the time series database we have – over 25 million hours, tied down to the SIC code – and that enables us to start with a profile that everyone understands. Everyone understands ergonomic injuries, and we can build upon that with the safety score, and that’s how we grow moving forward. From there, our barriers become educating the workforce and educating the groups so that we know where we can attack it. It’s surprising to learn that in a lot of the industries we’re in, they don’t necessarily understand the risk profile. So for us, it’s a matter of showing them that you need to measure it, and when you can measure it, you can put a delta on that ROI. And from there, that’s the budget we work from on innovation. That really has been our entire path forward.

Sean Petterson:

Now, as we partner with Chooch, that exercise starts to condense, because we’re assessing more of what’s inside of that risk profile, and we’re providing a bigger web for new clients to enter and helping them understand what connected safety means. It’s not just a widget anymore – wearables are a piece of Industry 4.0 now – but how do you fold this into the greater operation? That’s what we’re doing here, and that is, hands down, I think, our biggest competitive advantage, outside of the fact that we can go to enterprise scale. We can scale to tens and hundreds of thousands of users, which is a whole other challenge when you talk about data privacy and security, and that is something that we’ve built here into a very robust platform.

Michael Liou:

Following up on Sean’s comments: I mentioned earlier that we are a horizontal AI platform. So there are plenty of competitors out there. Often they focus on one particular solution or one particular vertical and develop great solutions for it. But one of the advantages our platform offers is that horizontal, industry-agnostic approach: we can solve not only today’s problems but tomorrow’s problems. And as you start adopting more AI into your corporate operations and undergoing digital transformation, you still have a key player that can deliver additional solutions. Ultimately, in talking to some of the OEMs and SIs out there, people don’t want to have seven or eight or 12 different providers doing different things.

Michael Liou:

It’s much more convenient and elegant to have one that has a uniform interface. Every time a prediction is generated, we generate a JSON file that has links to the JPEGs, as well as the class title, the inference thresholds, and the XY min-max of each of the objects that are detected. So it’s very, very uniform. And lastly, a lot of AI companies, despite developing some really great products, take time – it takes time to code that, and often it takes three months or six months to get something up and running. We often can do that within a day. So that’s a competitive advantage here at Chooch AI.
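To make the uniform-output point concrete, here is a sketch of consuming such a prediction file. The field names below are a guess at what a payload like the one described (class title, confidence, image links, x/y min-max boxes) might look like; they are not the documented Chooch schema.

```python
import json

# Hypothetical example of one prediction record; field names are assumptions.
payload = json.loads("""
{
  "frame_jpeg": "https://example.com/frames/0142.jpg",
  "detections": [
    {"class_title": "hard_hat",    "confidence": 0.93,
     "x_min": 412, "y_min": 88,  "x_max": 503, "y_max": 176},
    {"class_title": "safety_vest", "confidence": 0.88,
     "x_min": 390, "y_min": 180, "x_max": 560, "y_max": 430}
  ]
}
""")

# Every prediction can be handled the same way, regardless of which model produced it.
for det in payload["detections"]:
    w = det["x_max"] - det["x_min"]
    h = det["y_max"] - det["y_min"]
    print(f'{det["class_title"]}: {det["confidence"]:.0%} confidence, box {w}x{h}px')
```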

Keith Robinson:

From our technology point of view, there are companies out there that make similar technology. I think where we’re unique in this environment is that we not only provide the PC and employ the Jetson technology, but we also provide the middleware platform and the application capability. So from that perspective, it’s unique in terms of being able to provide the complete data sets across the board. I also wanted to pick up on the comment about what the biggest competition is. Sean actually mentioned what I think is one of the biggest competitors: organizations not yet realizing the importance of doing the risk assessment and the actual risk verification behind it. When I’ve been involved in this, that’s the biggest eye-opener for most of the organizations that I go into – for them to see initially that there is this risk assessment, or there is this requirement behind it. So for me, the biggest sales stopper is actually a customer not realizing that they need it in the first place and what the benefit can be.

Jeffrey Goldsmith:

That’s great. There are a number of questions around challenges. There’s a question about what’s the most challenging environment for doing these sorts of deployments. There was another question about physical obstacles when trying to detect workplace safety issues – they’re pretty related. And then there was a third question around falling object debris – how real-time is that? I think these are somewhat related: challenging environments, how real-time is this, and what do we do about obstacles? An obstacle in an environment – I could imagine something being in the way of a camera. From my point of view, that’s about having multiple cameras, multiple sensors, and backup systems to avoid having the obstacle be an issue. But how does everyone feel about difficult situations? Do you have to have [inaudible 00:50:08]. Go ahead, Michael.

Michael Liou:

Jeff, I think you’ve touched upon it. I mean, we haven’t developed the AI and cameras to look around corners quite yet.

Sean Petterson:

Jeff, I think you’re dropping out.

Michael Liou:

Can you guys hear me?

Sean Petterson:

Okay. Yes, go ahead.

Michael Liou:

This is Michael Liou. So just to repeat the question: how do we deal with obstacles, or where views might be obstructed? I was joking when I said we haven’t developed the AI to look around corners quite yet, but I think Jeff did touch upon it – having more cameras in key locations is certainly important. Having complementary sensor technology obviously can complement that ability. And lastly, there are cameras that are mounted at higher angles, seven feet, eight feet – and Axis, one of the world’s largest camera manufacturers, actually has ceiling-mounted cameras. So I think between camera placement and being thoughtful about what you’re looking for, coupled with some of the different types of manufacturers, we can probably solve most of those types of challenges. I’m sure Sean could talk more about the sensor approach that can complement our visual AI approach.

Sean Petterson:

So I think, first and foremost, where cameras can’t measure, StrongArm is constantly collecting a stream of data into a time series database, so we connect the dots with that retroactively. As we look toward the challenges of deploying infrastructure, we have to convince clients to put in new infrastructure, but we do that by pointing to an ROI – we’ll point to our ROI on a safety savings basis, and we grow from there. Now, as far as the most challenging environments, it’s about the dynamics. When we talk about beaconing and tagging and looking for other forms of risk outside of the individual, it’s very easy for us to put a sensor on a bulldozer or a fork truck. Where it gets really challenging is in these more dynamic environments. The most difficult environment we were ever in was, similarly, oil – it was a pipeline dig.

Sean Petterson:

These folks were down between 10 and 15 feet in a giant hole where they’re digging a pipeline, and we’re measuring their ergonomic functions, we’re measuring gas exposure, [inaudible 00:52:23] while we’re down there. But the biggest challenge, quite frankly, was actually crushing injuries. And we can’t put a beacon on a pipe – especially a beacon on a pipe that was just put down there 15 minutes ago, in the rain, in the mud. So being able to stand up solutions where we can dynamically understand this changing environment – i.e., with Chooch – enables us to adapt to these dynamic environments and send alerts. And that’s where this infrastructure comes back down to speed: the ability to train faster, learn about the environment quicker, and prioritize alerts based on the environment, not just a carte blanche type of solution. We’re tailoring it based on the dynamics, and that’s the difference between a pure hardware solution and a machine vision plus hardware solution.

Keith Robinson:

It’s very difficult for me to add to those answers – I thought they covered pretty much everything. I think the only thing to add from my side is that, again, the technology we’re adopting here is industrial technology as well, because it’s not just about whether the camera can see around the corner; it’s whether the actual technology can be used in some of the harsh and varied environments we’re talking about. So another key to making sure these systems are operational is ensuring that they’re vibration tested and that they can work across large operating temperature ranges. That also enables you, as I think Sean was saying before, when you’re putting in the camera technology, et cetera, to ensure that the rest of the system can fit in the right location, the right place, to give you that adaptability from the initial fitment and installation. And then, to Sean’s comment before about taking that data, being able to be dynamic in terms of how you adapt manages around most of those issues.

Michael Liou:

If I could add to the oil rig example, which we think is a harsh environment: we have gotten a number of inquiries from construction companies and home builders, and these obviously are going to be harsher conditions, as you’ve got a lot of heavy machinery and dirt and debris lying all over the place. So we’ve toyed with the idea of creating a portable server that we take from site to site, and during different phases of construction implementing different types of models to ensure that people are compliant – for instance, that they’re in the right PPE gear, making sure people are wearing the right vest for a certain type of function, and not using a cell phone if they’re near an earth mover.

Michael Liou:

And then when the foundation gets laid, a whole other set of models, and then as each floor gets constructed, putting these cameras and portable servers on those floors as masonry occurs and tiling occurs and other job functions happen. Obviously, all of these functions involve a fair amount of dirt and debris – it’s not your typical 35- or 40-degree server room, for sure.

Jeffrey Goldsmith:

Okay. Well, we’re almost at the top of the hour. This has been a great discussion. Do we have time for one more question? It’s about five minutes before the hour. Yeah. So there’s a very specific question about a specific industry – I’ll post the question to the panelists, but I’ll read it to the group. Within the grocery market there’s a clear demand to move smart AI camera technology into stores and distribution centers. Outside of the health and safety aspects, would this technology offer elements such as age verification, empty-shelf restocking alerts, item recognition for theft, queue alerts, customer emotions, inclusion of emergency services for medical issues, fires [inaudible 00:56:02]. This is a general question about what we can do for retail.

Keith Robinson:

Maybe I could pick up on a couple of comments and notes. Interestingly, we’ve already done applications where we’re actually using the vision AI technology for pricing food in the cafeteria, for example. So you put your food on the plate, it goes through, and it takes a look and says, you’ve got this amount of potato, this amount of Brussels sprouts, therefore we’re charging this amount of money. So in answer to that, it’s possible to detect and look at that sort of thing from the edge and the other points of view. The retail sector, I think, is probably going to be an area of never-ending possibilities. Coming back to the COVID requirements, self-checkout systems exist and have become very popular over in Europe; again, there’s an opportunity there to move forward in terms of automatic detection.

Keith Robinson:

So rather than having people going into confined areas to self-checkout, et cetera, using this technology to do that as they walk through is more than capable. And I’m sure this is something we’ll come back to, particularly in the retail industry in general – not just grocery but the fashion industry, where they can be detecting people picking up, for instance, items of clothing and automatically making suggestions on video walls, et cetera: you have now picked up this jumper, have you thought about this pair of trousers to match with it? Again, the number of examples is countless, and I think that’s a big, big area to be looked at in the long run.

Keith Robinson:

There are so many opportunities here. Again, in terms of DCs in particular, where you’re looking at making sure there’s verification of what’s going out the door – you mentioned pallets before, et cetera – there’s a whole area there of new technology which is non-invasive and enables you to do that. I’m aware of the time, so I’ll stop so that one of my colleagues can have the last couple of minutes for comments. Thank you.

Michael Liou:

I would say the answers are yes to all of them. We have our series of pre-trained models. We actually have a demographics model that will determine gender and age, and sentiment – we will estimate that for you. We’ve received inquiries on queue analysis: how can we open and close different queues based upon the loads of groceries within shopping carts, based upon whether a cashier is typically slow, medium, or fast, coupled with how many lanes are open. All of this is doable; that kind of analysis, I think, is fairly straightforward, and we’ve gotten inquiries along those lines as well. So the answer is yes – any type of human visual task of analysis or inspection which can be taught can certainly be replicated within our models as well.

Jeffrey Goldsmith:

Well, I think this has been a great discussion, everyone. I want to thank the panelists, Seth – sorry, Sean, my bad – Michael, and Keith. It’s been a great discussion. I want to thank all the audience for attending. We had about 80 people on the call, and that’s fantastic. So thanks, everyone, for attending. If you want to follow up with questions after the webinar, we are completely available. Please contact us at StrongArm Tech, at ADLINK, or at Chooch AI, and we’ll get right back to you. We’ll be sharing this on our blog and by email once the recording is live. All right. Thanks, everyone. [crosstalk 00:59:40]. Bye-bye.

 
