Pairing robots with AI: The story of robot dog, Spot
It's set to be a New Zealand first – a walking robot for commercial purchase that’s being used to perform real-world jobs. Equipped with advanced sensing capabilities, Spot is designed to help perform the many unsafe and arduous jobs in the workplace. Here’s Jourdan Templeton, of Aware Group, on how they've led AI for Spot here in New Zealand.
Credit: Boston Dynamics
Exactly who – or what – is Spot?
Spot is Boston Dynamics’ first commercially available robot. Boston Dynamics is a US company that has been around for decades – you should see some of their creations, they're surprisingly nimble and agile.
You can basically click-and-buy from Boston Dynamics right now. It’s a bit harder to get them into New Zealand – and that’s another reason Aware Group partnered with them to make that happen.
Aware Group has a very specific interest in Spot: how to take the out-of-the-box Spot and customise it for specific industries.
Out-of-the-box Spot is essentially a dog-shaped robot that can walk around and avoid obstacles. Aware Group is building its solution to turn it into something useful, with social and business benefits.
Robotics is still so new to New Zealand that there aren't many tools and platforms available to help you control and manage a robot right now. For this we partner with New Zealand company Rocos.
What can Spot do, what can’t Spot do?
Out-of-the-box, Spot can do a lot. Spot can walk around, hop and run. Spot also has 360-degree collision avoidance built in, which means that if the robot was told to walk into a wall it would stop within the threshold that you’ve configured. It can also walk up and down stairs because it has depth-sensing capability.
Spot detects objects in front of it; its capabilities come down to the types of sensors mounted on it.
There are some things Spot can’t do. One is that it can’t see glass – glass looks invisible to Spot. Collision avoidance is usually done using infrared depth mapping, and infrared light passes straight through glass.
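The stop-within-threshold behaviour described above can be sketched as a simple check over depth-sensor readings. This is a hypothetical illustration, not Spot's actual control code – the function names, threshold and sensor values are all assumptions. It also shows why glass is a problem: a surface the infrared sensor can't see returns no depth at all, so nothing trips the check.

```python
# Hypothetical collision-avoidance check: stop if any depth reading
# falls inside the configured threshold. Not Spot's real control code.
import math

STOP_THRESHOLD_M = 0.5  # configurable stopping distance, in metres

def should_stop(depth_readings_m):
    """Return True if any obstacle is closer than the threshold.

    Readings of math.inf model surfaces the infrared sensor cannot
    see - such as glass, which the light passes straight through.
    """
    return any(d < STOP_THRESHOLD_M for d in depth_readings_m)

# A wall 0.3 m ahead triggers a stop...
print(should_stop([2.1, 0.3, 4.0]))       # True
# ...but glass returns no depth reading, so the robot keeps walking.
print(should_stop([math.inf, math.inf]))  # False
```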
To test Spot out as we’re building up capabilities such as these in the office, Spot has its own little playpen.
Battery life is another limitation – Spot only lasts about an hour and a half before someone needs to swap the battery. For most scenarios that's enough time, so it isn't a limitation at this point, but if you needed the robot to run all day it would become one.
A bit of background into Aware Group – who are you, what do you care about, what problems do you work to solve?
All the work we do comes back to data. That’s the main story we tell: everything from data creation – the Internet of Things and robotics – through to warehousing and visualising that data and creating new information. Machine learning and AI are the big areas for us.
What we’re known for is how we use that data. We’d like it to always be people centric. We obviously want to produce solutions that are useful for people but we want to do it in a way that is privacy compliant and respects the privacy of people.
The crux of what we focus on at Aware Group is catering to specific needs and industries. We’re about making robots smarter for practical use cases.
Computer vision is a pretty common thing that a lot of companies are doing these days, and there’s a lot of research and development happening in that area. We’ve also recently had an awesome couple of projects in the computer audition space – computer listening.
But the big focus for us recently has been getting into computer or machine olfaction, which is computers having a sense of smell so the idea that machines can now detect odours.
A myth about robots that needs correcting?
Robots are already more present than people think.
Yes, Spot is four-legged and shaped like a dog because mechanically that’s been found to be the best fit. But there are plenty of other robots – on wheels, for example. Even an elevator could be considered a robot: it’s a piece of machinery that does a task and is fit for purpose.
We’re going to see a lot of different form factors with more robots integrated into things that we take for granted today.
The future of robots?
We’ll also see a lot more of these senses being integrated into robotics and AI, because we want to create experiences that are personal to people and that support a wide range of needs. By giving these robots and machines the same senses we have, they can mimic the human experience and react to things in a similar way to how a human would. This might sound strange to some – but it will be very helpful to both social and business needs going forward.
Take the kiwifruit orchard example from the beginning: taking photos of fruit and counting it is easily solvable with today's technology, but part of what we’re focused on is actually smelling the kiwifruit and using that to determine when to harvest, or whether any disease is present.
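To make the olfaction idea concrete, here is a minimal sketch of how odour readings might be classified against reference profiles. Everything here is invented for illustration – the sensor channels, the numbers and the labels are assumptions, not Aware Group's actual approach – but it shows the basic shape of the technique: match a new electronic-nose reading to its nearest known odour profile.

```python
# Hypothetical sketch: classify an electronic-nose reading as unripe,
# ripe or diseased by finding the closest reference odour profile.
# All channels and values are invented for illustration.

def classify_odour(reading, profiles):
    """Return the label of the reference profile closest to the
    observed reading (by squared Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(profiles, key=lambda label: dist(reading, profiles[label]))

# Invented reference profiles over three made-up sensor channels.
profiles = {
    "unripe":   (0.1, 0.2, 0.0),
    "ripe":     (0.8, 0.9, 0.0),
    "diseased": (0.3, 0.2, 0.7),
}

print(classify_odour((0.7, 0.8, 0.1), profiles))  # ripe
```

A production system would learn these profiles from labelled samples rather than hard-coding them, but the matching step is the same idea.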
Even though big changes are happening, there are also an expanding range of jobs associated with robotics. Drones need operators, photographers, pilots, for example. Robots need people who are trained to maintain, drive and run them.
Pairing AI and robotics has an important role. We’re not saying all robots will be autonomous, yet big changes are happening, and both the public sector and businesses are realising new benefits – robots can work 24/7 and take fewer breaks, that kind of thing.
With these massive changes, there must be opportunities to up-skill people and introduce new types of jobs.
It is unfortunately true that some job areas will be automated, such as factory and manufacturing tasks. We are in the midst of another shift, similar to the industrial revolution that led to factories.
We believe in creating solutions that always deliver significant value, remain affordable, and will inspire new innovation.
Data, AI, BI & ML
Artificial Intelligence and Machine Learning are related terms in computer science.

Artificial Intelligence: the phrase comprises two words, “artificial” and “intelligence”. Artificial refers to something made by humans rather than occurring naturally, and intelligence means the ability to understand or think. A common misconception is that Artificial Intelligence is itself a system; rather, AI is implemented within a system. There are many definitions of AI; one is “the study of how to train computers so that they can do things which, at present, humans do better.” It is about giving machines the capabilities that humans have.

Machine Learning: machine learning is learning in which a machine learns on its own without being explicitly programmed. It is an application of AI that gives a system the ability to automatically learn and improve from experience – in effect, generating a program from examples of its inputs and outputs. One widely used definition is: “A program is said to learn from experience E with respect to some class of tasks T and a performance measure P if its performance at tasks in T, as measured by P, improves with experience E.”
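The E/T/P definition above can be demonstrated with a tiny, self-contained example. This is an illustrative sketch, not any particular library's API: the task T is predicting y from x, the performance measure P is mean squared error, and experience E is repeated training passes over the data. The program is never told the rule y = 2x + 1; its performance improves purely with experience.

```python
# Minimal illustration of "learning from experience E at task T
# with performance measure P": fit y = 2x + 1 by gradient descent.

data = [(x, 2 * x + 1) for x in range(10)]  # labelled examples

w, b, lr = 0.0, 0.0, 0.01  # model parameters and learning rate

def mse(w, b):
    """Performance measure P: mean squared prediction error."""
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

error_before = mse(w, b)
for _ in range(2000):          # experience E: repeated passes over the data
    for x, y in data:          # task T: predict y from x
        err = w * x + b - y
        w -= lr * err * x      # gradient step on the squared error
        b -= lr * err
error_after = mse(w, b)

print(error_after < error_before)  # True: P improved with E
```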