What have The Robots ever done for us?


Robots can address real-life human problems

Dr Fernando Auat Cheein, Associate Professor in Robotics and Autonomous Systems

Early ‘robots’ in agriculture were actually automated harvesting machines. We’ve added intelligence and decision-making capability, and now robots harvest, plant, seed, prune, manage herbicides, and monitor and observe the characteristics of crops – all the main agricultural tasks.

Connecting those robots to the cloud means the robot gets information from its sensors, and farmers can make future decisions based on real-life data.

For example, you can distinguish between what should be harvested and what should be left in the field, and estimate the size of the harvest before picking the fruit, to plan better and reduce costs.
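
As a rough illustration of that kind of estimate (the counts, weights and function name below are hypothetical, not from the project), a pre-harvest prediction can be as simple as scaling up fruit counts detected in a few sampled rows:

```python
# Hypothetical sketch: scale fruit counts detected in a few sampled rows
# up to a whole-field harvest estimate. All figures are illustrative.

def estimate_yield(ripe_counts_per_sampled_row, total_rows, avg_fruit_weight_kg):
    """Estimate total harvest weight (kg) from counts in sampled rows."""
    avg_ripe_per_row = sum(ripe_counts_per_sampled_row) / len(ripe_counts_per_sampled_row)
    estimated_fruit = avg_ripe_per_row * total_rows
    return estimated_fruit * avg_fruit_weight_kg

# Example: a camera-equipped robot counted ripe fruit in three sampled rows.
print(estimate_yield([420, 390, 455], total_rows=120, avg_fruit_weight_kg=0.18))
```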

Robots are not replacing people in agriculture; there aren’t enough humans to do the work and that is becoming a greater problem. Globally, younger generations don’t want to work in the fields. It’s better to start doing things with robots now than wait until we don’t have a human labor force.

Robots can also do tasks humans shouldn’t do, like carrying 20kg of fruit on their backs.

There are also environmental benefits, as robots are largely electrically powered rather than running on combustion engines.

One challenge is that agriculture can be resistant to change and wants clear evidence that technology works. If it doesn’t work, you have to wait until next year, and try something different.

Another challenge is that different fruits and vegetables need different solutions – in different countries. If we design a robot that works in Scotland, where we have specific weather and soil, it might not work well in Brazil or Australia, where the weather, soil and humidity are so different.

So we need technology to help countries where there isn’t enough water, for example, to spread herbicides efficiently so they don’t damage soils, and to monitor all the different indicators.

Ultimately, it’s about using robotics, AI and data to make agriculture more efficient and deliver larger, better-quality and more predictable crops. The world’s population is growing and we need more food.

In Europe, where land is scarce, we’re taking this a stage further into vertical farming, where the inputs are controlled in an industrial process. That’s developing all the time, but there is still so much that robotics can do for traditional ‘field’ agriculture.

Helping blind people to ‘see’ the world around them

Verena Rieser, Professor in Computer Science

We worked with the RNIB on an app called Be My Eyes, which blind and partially sighted people can download to connect to sighted volunteers. They point their phone at, for example, the back of a cereal box or their medication, and the sighted person reads it – or tells them what it is. Human involvement limits when you can do this, and if you slip in the bathroom, you don’t necessarily want to call a person you’ve never met.

So we thought ‘What if we had artificial intelligence available 24/7? How could that help people to live more independently, to be their eyes?’

We’re focused on three research challenges. First, how can these models be improved for use in a conversation? When you talk to someone on the phone, you ask follow-up questions, not just one. How could we make the AI conversational?

Then we asked whether these models can adapt to new tasks and new users. They are pre-trained on a massive data set, then almost frozen and fixed. In the real world, things change around you. You might move house, or travel to a different country. How do robots adapt?

The third challenge is image quality. Current models are trained on high-definition pictures. If a partially sighted person is the photographer, they don’t always take great pictures because they can’t really see what they’re photographing. Pictures might be blurry, rotated or partly obscured.

The overall aim is to help people to live more independent lives, to have an assistant or companion there for them 24/7 – and to live safer lives. The models can make mistakes; they can’t be super-confident and need to be able to communicate uncertainty so they can warn the user to be careful.
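
One simple way an assistant could communicate that uncertainty, sketched here with an assumed confidence threshold and wording rather than the team’s actual approach, is to soften its answer and add an explicit warning when the model’s confidence is low:

```python
# Hypothetical sketch of flagging model uncertainty to the user.
# The threshold and wording are illustrative choices, not from the project.

def describe_with_uncertainty(label: str, confidence: float, threshold: float = 0.8) -> str:
    """Turn a model prediction into an answer that warns the user when confidence is low."""
    if confidence >= threshold:
        return f"This looks like {label}."
    return (f"I think this might be {label}, but I'm not sure - "
            "please double-check before relying on it.")

print(describe_with_uncertainty("a box of paracetamol", confidence=0.55))
```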

We want to make this a reality – to help people live fuller and safer lives where they’ve got more opportunities to understand and interpret the world about them more effectively.

We’re also looking to develop the social companion aspect – using AI and robotics to chat to people, working with Alana, a spin-out company from Heriot-Watt.

Doing important jobs in hazardous environments

Dr Sen Wang, Associate Professor in Robotics and Autonomous Systems

When it comes to maintenance checks on offshore wind turbines and oil rigs, it’s really dangerous to use human divers in deep, dynamic and dangerous seas.

Having a robot working underwater, connected to an operator, is much safer and more efficient. The robot can send back data to monitor these assets more effectively, and identify when repairs are needed in a more timely way.

Underwater robots use cameras to collect visual data from underwater structures, providing the remote operator with first-hand data to understand what’s going on. They also have sensors designed specifically for underwater use, like sonar, which uses sound waves to help the robot understand the environment even in very poor visibility.

These sensors also help the robots navigate safely around the structures.

There are specific robots for specific jobs, like pipeline inspection. The data they provide means we can see defects and cracks. We have machine learning algorithms that automatically detect these and relay them to the human operator, who can then look in more detail.
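
A minimal sketch of that flow, assuming a stand-in classifier and an invented alert threshold rather than the actual pipeline, might score each inspection image and flag only the likely defects for the operator:

```python
# Hypothetical sketch: flag probable defects in inspection images for the human operator.
# `defect_classifier` stands in for a trained model and is an assumption, not a real API.

from typing import Callable, Iterable, List

def triage_images(images: Iterable[str],
                  defect_classifier: Callable[[str], float],
                  alert_threshold: float = 0.7) -> List[str]:
    """Return the image paths whose defect score is high enough to show the operator."""
    flagged = []
    for path in images:
        score = defect_classifier(path)  # probability that the image shows a crack or defect
        if score >= alert_threshold:
            flagged.append(path)
    return flagged

# Demo with a stand-in classifier backed by made-up scores.
demo_scores = {"pipe_001.png": 0.20, "pipe_014.png": 0.85, "pipe_022.png": 0.91}
print(triage_images(demo_scores, defect_classifier=demo_scores.get))
```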

For wind turbines, for example, you are deploying them for 20-30 years and you need to know they are in a good condition throughout their working lifetime.

So there are safety and efficiency benefits, but also cost benefits; it’s expensive to send divers and boats out to do inspections. Autonomous robots can be sent out in small electric boats – safer because there is no human on board, and greener than large boats running on oil.

Our challenge is bringing down the cost of robots. Basic inspection robots are fairly cheap, but high-end inspection robots can cost well in excess of £500,000, though we know the costs will come down significantly over time.

We are currently carrying out inspections using robots, but the next step is those robots carrying out underwater repairs and maintenance, either autonomously or working with a remote human operator.

More generally, robotics has a big part to play in developing our wider understanding of marine biology in the deep sea environment, as our current knowledge is very limited.

Getting more accurate diagnosis of UTIs

Professor Lynne Baillie, Professor of Computer Science

An estimated 150 million people globally are affected by UTIs (urinary tract infections), which can cause severe problems, including sepsis and kidney damage.

There are many symptoms and UTIs are hard to diagnose accurately. You need a lab test, so it’s not a quick result. In the meantime, antibiotics are usually prescribed, whether or not a person ultimately has a UTI.

There are major issues around the over-prescription of antibiotics. It goes wider than the people who actually get UTIs, because over-prescribing leads to more and more antibiotic-resistant bacteria.

So UTIs put a lot of pressure on the NHS, which is why we want to collect and analyze data about people’s daily activities to get more accurate diagnoses.

That data might include how often a person goes to the bathroom, how much they move around the house, whether they are doing their normal activities, or sleeping more than usual. When we bring all that data together, we might suspect a person has a UTI, and then we use an intelligent agent, like an Alexa, to ask additional questions to confirm that.
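
Purely as an illustration (the signals, weights and threshold here are invented, not the project’s model), bringing that data together could look like a simple scoring rule that decides when to trigger the follow-up questions:

```python
# Hypothetical sketch: combine daily-activity signals into a rough UTI "suspicion" score.
# The weights and threshold are invented for illustration; a real system would learn
# them from data collected at sites like Leuchie House.

def uti_suspicion_score(bathroom_visits: int, usual_bathroom_visits: int,
                        hours_slept: float, usual_hours_slept: float,
                        movement_events: int, usual_movement_events: int) -> float:
    """Return a score in [0, 1]; higher means the daily pattern looks more unusual."""
    more_bathroom = max(0, bathroom_visits - usual_bathroom_visits) / max(usual_bathroom_visits, 1)
    more_sleep = max(0.0, hours_slept - usual_hours_slept) / max(usual_hours_slept, 1.0)
    less_movement = max(0, usual_movement_events - movement_events) / max(usual_movement_events, 1)
    score = 0.5 * more_bathroom + 0.25 * more_sleep + 0.25 * less_movement
    return min(1.0, score)

score = uti_suspicion_score(bathroom_visits=11, usual_bathroom_visits=6,
                            hours_slept=10.5, usual_hours_slept=8.0,
                            movement_events=40, usual_movement_events=90)
if score > 0.5:
    print("Ask follow-up questions via the voice assistant")  # e.g. about pain or fever
```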

We’re starting the project with patients fitted with catheters, as they have a higher chance of developing UTIs. We’re working with Leuchie House (in North Berwick) because most people who go there for respite care are catheterized.

Some rooms at Leuchie House already have sensors and smart devices, and we’ll collect data as a first step to seeing if we are in fact correctly diagnosing that people have a UTI.

We’ll also work with Blackwood Homes and Care, with people receiving different levels of care or assistance, including those living pretty independently but who still have an alarm or similar device.

After that, we’ll look to go into everyday households. This project runs to 2025 and we want it to be real, not theoretical – something that’s adopted by organizations looking after older people. The NHS is on board, with Dr Steve Leung as co-investigator on the project. Everyone is committed to delivering this to patients as soon as possible.

We’ll use the assisted living flat at the National Robotarium to analyze data collected through the sensors and intelligent agents. What is the impact on energy use? Is it costing too much to identify this issue? It’s an important balance.


www.scotsman.com
