Five trends in industrial robotics that are helping to transform manufacturing

Today a Meltwater search of ‘robotics’ headlines tallies 24 results across 12 titles, including BBC News, the Financial Times and HR magazine. We’re dedicated to following robot trends across all industries - whether it’s AI being used in finance or robotic exoskeletons helping paralysed patients walk again. As an agency, however, we are particularly focused on the manufacturing industry, where many of our clients are helping UK companies achieve faster and more flexible production.

Robotic technology is developing rapidly in a number of areas, driven by demand for greater flexibility and speed. It was difficult to pick only five, because so many industries within UK manufacturing have the potential to use robots. Nevertheless, we managed to narrow it down. Here are the five trends we think are the most exciting to track right now:

Collaborative robots
The term ‘collaborative robot’ usually conjures up visions of smaller robots working alongside people. A number of models have been developed to bring collaboration to new areas of production such as electronics, pharmaceuticals and automotive, as well as to small and medium-sized manufacturers and workshops. Some can sense and react to potential collisions, while others are designed so that if a collision does occur it won’t injure the co-worker.

Lesser known are the large-scale industrial robots fitted with sensor technology so that they stop before a human gets within a certain radius. Researchers are even exploring software that lets these robots interact closely with humans - see Madeline Gannon’s work here.

If larger robots are able to collaborate with us, then we could be lifting cars with a wave of our hands in no time.

Machine tending
Robots can be adapted into different configurations according to a customer’s needs. Machine tending calls for many different end-of-arm tools, and robot manufacturers are creating cells that are especially adaptable for this purpose.

There is also a growing skills gap. Many companies are introducing robots to perform the manual loading and unloading so that skilled employees can apply their expertise to other work steps.

Digital maintenance
With the advent of smartphones and 4G, the possibilities for maintenance engineers, factory managers and CEOs to communicate with equipment around the factory floor have expanded. When 5G lands, expect even more. For now, though, remotely monitoring robot performance is already a reality. Because robots are driven by data (their programs) and generate data as they run, that data can be analysed to produce insightful statistics such as how fast the robots are performing and how many parts have been processed.
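As a rough illustration of the idea, here is a minimal Python sketch that turns hypothetical robot cycle logs into the kind of statistics mentioned above. The log format, field names and figures are invented for the example rather than taken from any particular robot controller:

from datetime import datetime

# Hypothetical cycle log: one record per completed part, roughly as a
# robot controller or IoT gateway might report them (format invented
# purely for illustration).
cycle_log = [
    {"robot": "cell-1", "finished": "2019-09-02T08:00:12", "parts": 1},
    {"robot": "cell-1", "finished": "2019-09-02T08:00:47", "parts": 1},
    {"robot": "cell-1", "finished": "2019-09-02T08:01:21", "parts": 1},
]

def throughput_stats(log):
    """Return total parts processed and average cycle time in seconds."""
    times = sorted(datetime.fromisoformat(r["finished"]) for r in log)
    total_parts = sum(r["parts"] for r in log)
    if len(times) < 2:
        return total_parts, None
    span = (times[-1] - times[0]).total_seconds()
    return total_parts, span / (len(times) - 1)

parts, avg_cycle = throughput_stats(cycle_log)
print(f"Parts processed: {parts}, average cycle time: {avg_cycle:.1f} s")

In practice the records would stream from the robot controller or an IoT gateway into a dashboard, but the principle is the same: timestamps in, throughput figures out, viewable from any connected device.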

Many of us are already used to analysing data in the form of social media analytics, for example. If there can be this much insight into engagement on our smartphones, driving changes in the way we interact with each other, then analysis of robot performance could mean great changes in the way manufacturing is done - all at the tap of a screen.

Warehouse logistics
With so many of us ordering goods online, whether it’s new clothes or a new sofa, warehouses are having to adapt quickly to manage the incoming orders. Warehouse automation has become a differentiator for many online brands. Leading names such as Ocado and Amazon have invested heavily in robotic technology; Ocado even has its own innovation department developing its own robots.

As more of us come to rely on shopping being delivered to our door, greater numbers of warehouses are going to need robots to maintain their position in the market.

Here’s a video of Ocado’s robots in action: https://www.youtube.com/watch?v=4DKrcpa8Z_E

Food and beverage
As food trends proliferate from veganism and vegetarianism through to the paleo diet and sugar-free, the demand on food and beverage brands to continue to churn out relevant products means flexibility is key.

Robots are adept at providing flexibility. They can pick, pack and place products using vision technology that recognises various shapes and sizes. It all comes down to the programming, which is taking less and less time thanks to innovative programming software. Robots also bring speed, so if a confectionery brand needs to release a timely limited-edition chocolate bar, it can do so without too much hassle.
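To make the ‘it all comes down to the programming’ point concrete, here is a minimal, hypothetical Python sketch in which the pick-and-place settings are looked up from the product class reported by a vision system. The product names, gripper values and lane numbers are invented for illustration and aren’t taken from any real cell or vendor software:

# Hypothetical mapping from the product class reported by a vision
# system to the pick-and-place settings for that product.
PICK_RECIPES = {
    "standard_bar":        {"gripper_width_mm": 45,  "place_lane": 1},
    "limited_edition_bar": {"gripper_width_mm": 52,  "place_lane": 2},
    "multipack":           {"gripper_width_mm": 120, "place_lane": 3},
}

def pick_and_place(detected_class):
    """Look up and report the pick settings for a detected product class."""
    recipe = PICK_RECIPES.get(detected_class)
    if recipe is None:
        raise ValueError(f"No pick recipe for product: {detected_class}")
    # In a real cell these values would be sent to the robot controller.
    print(f"Picking {detected_class} with a {recipe['gripper_width_mm']} mm "
          f"grip, placing on lane {recipe['place_lane']}")

pick_and_place("limited_edition_bar")

The flexibility argument shows up in the structure: adding a limited-edition product is one extra entry in the table rather than a re-engineering job.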

To understand more about what robots have to offer the food and beverage industry, watch this video from Wired https://www.youtube.com/watch?v=SKBHnbYo-4s


AI bias and a new agriculture: ‘AI: More than human at the Barbican’ review part two

Over the last few days we’ve scanned many headlines heralding the future of artificial intelligence, such as the £1bn Series C funding of CMR Surgical, a Cambridge-based company set to launch a surgical robot, and SoftBank’s plans to open a cafe run by humanoid robots in Tokyo. These headlines are unsurprising - fast developments in AI technology mean that what was sci-fi literature fifty years ago is now becoming a reality.

Nowhere is this easier to comprehend than at an exhibition dedicated to the technology. In August we made the most of the longer evenings and made our way to the Barbican for ‘AI: More than human.’ Situated within the Barbican Estate in the City of London, the Barbican Centre has a large space fit for hosting thought-provoking events showcasing cinema, theatre, dance and art.

So when we arrived at the venue, our brains were already switched on to learn more about AI and how it’s transforming the world around us. 

Here’s the second part of Account Manager Rose’s review of the exhibition.

By replicating the workings of the human brain, scientists were able to develop ‘neural networks’ in the form of computer programs. Here we were, three quarters of the way through the exhibition and into the early 21st century, arriving at the stage where AI began to proliferate into hundreds of applications. What enabled AI to be realised? Partly it was the power of modern computing, but it was also the work of Alex Krizhevsky, whose AlexNet - a neural network trained to label the high-resolution images of the 15-million-plus-image ImageNet database - got the ball rolling.

The link between this development and the wider outcomes of AI’s influence was demonstrated by an art piece called ‘Myriad (Tulips)’ by Anna Ridler. The piece on display was just a fraction of the 10,000 pictures of tulips she photographed and categorised to highlight the human work that sits behind machine learning.

If humans influence AI so much, then can we trust those humans to form a fair representation of the world we live in? Can we rely on humans to use the technology for the betterment of the world? Echoing back to part one, many of us are frightened because at its core AI can be seen to represent a side of humanity that we haven’t quite grasped yet.

The data universe

The human influence on AI was explored in great detail in the third part of the exhibition, ‘Data Worlds.’ Bringing AI’s underbelly to the surface, this section opened with a cartoon depicting AI in China, where the technology not only monitors cities but also keeps track of the population. Later came Lauren McCarthy’s smart home experiment, in which a human plays the role of the intelligent assistant, revealing the relationship between smart devices and the private lives of those who use them. ‘Gender Shades’ by Joy Buolamwini examined the misrepresentation of race and gender in datasets. All of this conspired to leave me thinking, ‘Is AI a bad move for us?’

It’s reassuring to know that there are some really inspiring people out there conducting research projects that raise these questions. If no questions are asked and we go full steam ahead, we may end up with a world that we don’t really want. In the concluding paragraph of an article published in The Economist last week, a clause that rang true for me was: ‘If problems can be foreseen they can be more easily prevented.’

But as well as being understandably cautious, we should look at the positives coming from AI. The final section of the exhibition, ‘Endless evolution’, examined AI’s potential to improve our bodies, eliminate disease and even address famine.

The doctor will see you now

Mental health charity Mind has offered some perspective on the UK’s worry that more and more of us are struggling with our mental health. Apparently the number of people struggling hasn’t changed; what has changed is how we are coping with it, and that has taken a more serious turn.

To properly treat mental health we either need a lot more counsellors, psychiatrists and medication, or an alternative provided by technology. One section of ‘AI: More than human’ touched on the human need for connection in an increasingly digital world, with chatbots programmed to be as human as possible chatting with attendees. Experts are already suggesting that AI could help counsel patients, and online counselling services such as Big White Wall and Ieso are already in place in some UK regions.

Furthermore, AI can help doctors to detect diseases early and prevent life-threatening outcomes. Just this week, Google Health director Michael Macdonnel talked about an early-stage AI-powered system that interprets optical coherence tomography (OCT) retinal images and identifies the signs of sight-threatening disease.

Other companies are experimenting with 3D-printed body parts - take Axial3D’s work on building 3D models of the anatomy from 2D images. The company has already started work on an algorithm that could make 3D-printed organs the norm in a hospital near you.

3D printing organs on demand could potentially save thousands of lives.

What’s eating AI?

‘AI: More than human’ also showed a small plant farm nurtured by AI. Small and innocent enough, it echoed plans already underway at UK universities for larger farms to begin using smart sensors. These collect data that gives a greater understanding of crops from a distance, so that the right fertiliser or the right amount of water can be applied remotely. More judicious use of pesticides can also prevent harm to the soil.
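As a rough sketch of how such a remote decision might work - with field names, readings and thresholds invented purely for illustration - a few lines of Python are enough to turn sensor data into an action plan:

# Hypothetical soil-moisture readings (% volumetric water content)
# from field sensors, as a remote farm dashboard might receive them.
readings = {"field_a": 12.5, "field_b": 31.0, "field_c": 18.2}

# Illustrative threshold: irrigate any field that falls below it.
IRRIGATE_BELOW = 20.0

def irrigation_plan(moisture_by_field, threshold=IRRIGATE_BELOW):
    """Return which fields need water, based on the remote sensor data."""
    return {field: level < threshold for field, level in moisture_by_field.items()}

for field, needs_water in irrigation_plan(readings).items():
    print(f"{field}: {'irrigate' if needs_water else 'no action'}")

In a real deployment the readings would stream in from sensors in the soil and the thresholds would come from agronomists, but the principle is the same: data in, targeted intervention out.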

The world’s population is expected to grow from 7.7 billion to nearly 10 billion by 2050. Pitch this against a finite amount of arable land and we need to start thinking about ways to use technology to sustainably produce food, and fast.

Terramera’s Founder Karn Manhas summed it up in an article in Greenbiz earlier this year. He said, ‘Technology such as artificial intelligence (AI), robotics and big data might not be commonly associated with ‘natural’ or ‘health’ movements but actually, these advanced technologies are allowing us to eat cleaner, more locally and more sustainably than ever before.’

Robots picking fruit are helping to close the skills gap as well as reduce food waste. Drone pollinators and self-driving tractors are being developed to help drive efficiency and AI is used to make sense of farm data so that farmers can increase the health of crops, boost yields and ultimately provide better quality, affordable food.

If AI can help us feed the planet, then it’s definitely worth the research.

AI overwhelm

All of this AI in one go was a lot to absorb. It took ‘What a Loving and Beautiful World’, an AI installation of screens showing butterflies and paintbox colours, to round the exhibition off nicely. We could choose to interact directly with the panels, touching the Chinese characters to influence the space, or simply sit and contemplate the surroundings, in awe of all the elements combining to create the artwork.

We left asking ourselves the question, “Should we play a passive role in the developments of technology around us or make it our responsibility?”

If AI is to be shaped by human consciousness, then this question should not be asked by attendees of ‘AI: More than human’ alone; it should be asked across the world.


From Golem to governing society: 'AI: More than human’ review part one

September welcomes the start of another academic year, and the media has been busy as usual covering the latest Science, Technology, Engineering and Maths (STEM) news. As the skills gap continues to widen, Politics Home reports that primary school teachers are struggling to engage students with STEM subjects. Increasingly, young people have to take responsibility for their own development in these areas, dedicating their own time to learning about the latest technologies.

Over the summer we shared the list of IET open days taking place across the UK. We hope you and your families got the chance to attend (if you did, please share your experience with us on Twitter - we’d love to hear from you). To follow our own advice, we also decided to delve a bit deeper into tech over the six-week holiday and attended the critically acclaimed ‘AI: More than human’ exhibition at the Barbican.

Here’s our Junior Account Manager Rose’s account of the show. Broken into two parts, this is part one:

Sometimes you can see it, other times you can’t: artificial intelligence has a habit of sneaking up on us when we least expect it. Whether it’s the use of facial recognition at London’s King’s Cross or the Cambridge Analytica scandal, many are wary of the fast-developing technology, and understandably so.

However, are our fears more to do with how the technology is used, rather than the technology itself? If it’s the former, we need to ask some difficult questions about ethics. Do we trust Homo sapiens to implement technology for the greater good of humankind, the planet and the other species that live here? ‘AI: More than human’ at the Barbican prompted many such questions. It explored how civilisations across the centuries have worked, albeit sometimes unknowingly, towards today’s rapidly developing world of advanced technologies. But, just as any good exhibition should, it also provided some very interesting answers as to how and why the AI revolution has happened and what the future may look like if we continue in the same vein.

For how long have we wanted to create robots?

The exhibition opened with ‘The dream of AI’ and showed how humans have always been curious about the artificial creation of living entities, whether through magic, science, religion or illusion. From the belief in sacred spirits living within inanimate objects in Shintoism through to the Gothic literature of the nineteenth century, the early roots of AI manifest themselves in different ways across various cultures as far back as 400 BCE.

 

Take, for example, the tradition of the Golem in Judaism. According to the Talmud, this mythical figure originated as dust or clay ‘kneaded into a shapeless husk’ and was brought to life through complex, ritualistic chants described in Hebrew texts. An image from artist Lynne Avadenka’s book ‘Breathing Mud’, shown in the exhibition, explores the relationship between sacred letters and the life given to the Golem, and by extension to the world. It reminded me of early mathematical diagrams and of the code so often used today to program otherwise inanimate objects such as robots.

Apparently Jewish mystics in Southern Germany made attempts to create a Golem in the Middle Ages and believed this process would bring them closer to God. Is humankind’s fascination with creating artificial life a spiritual exercise after all?

The Uncanny Valley

Later in this section of the exhibition, the Gothic tradition of the nineteenth century was cited as significant. Gothic literature such as Mary Shelley’s ‘Frankenstein’ (1818) and Bram Stoker’s ‘Dracula’ (1897) blurs the line between the living and the dead and evokes an emotional response of terror - yet people continue to enjoy these novels and the many films and television series that have stemmed from them.

Is it the element of the uncanny within these stories which appeals to us? Sigmund Freud’s essay ‘The Uncanny’ (1919) defines the uncanny as ‘belonging to all that is terrible - to all that arouses dread and creeping horror’, but it also explains that the uncanny is formed when something unfamiliar is added to what is familiar, according to English professor Jen Boyle’s interpretation of the text.

Perhaps this is why we get so perturbed by Count Dracula, essentially a human being with a deathlike twist. Or by Frankenstein, the great inventor who made a monster in a scientific experiment using electricity and human body parts.

These creatures remind us of us - they’re part human, part monster. However, instead of supporting the positive self-image we like to preserve, they actually highlight the darker side of our psyches. They expose the capacity for human beings to become twisted and give in to their innermost desires.

‘AI: More than human’ goes even further in its exploration of the uncanny and its relationship to AI. The uncanny valley, a hypothesised relationship between the degree of an object’s resemblance to a human being and the emotional response it provokes, was demonstrated in a graph (see below). It shows that as the appearance of a robot is made more human, people respond more empathetically, until it reaches a point where the robot looks too human - social humanoid robots, for example - at which point people’s responses quickly turn to strong revulsion.

The Uncanny Valley Graph

Equally, if AI takes on too many human qualities such as empathy, creativity and leadership, many of us become perturbed, which is continually reflected in the news headlines today. 

Mind machines

The exhibition continued with a close look at the technological developments of the 19th and 20th centuries, when the belief that rational thought could be systematised and turned into formulaic rules became more prevalent. Ada Lovelace, often considered the world’s first computer programmer, wrote a letter concerning a ‘calculus of the nervous system’ as early as 1844. As a young girl she was a particularly keen mathematician and was taken by her mother to see a demonstration model of the Difference Engine, the first computing machine designed by Charles Babbage. Roughly ten years later she worked with Babbage on the Analytical Engine, a general-purpose computer designed to hold a store of 1,000 numbers of 40 decimal digits each. Although the Analytical Engine was never completed, its design anticipated the general-purpose computers that arrived a century later.

During the Second World War, the Bombe - an electromechanical machine specified by Alan Turing - was used to decode the Enigma messages sent by the Germans and played a pivotal role in enabling the Allies to defeat the Nazis. The wartime codebreaking effort also fed into the development of early electronic computers such as ENIAC (1946) and UNIVAC (1951).

One of the most significant developments in the history of AI happened in 1956 at the Dartmouth Conference, a two-month event organised by computer scientist John McCarthy. Everybody who was anybody in the world of computing attended, to work on the problem of how machines could use language, form concepts and improve themselves over time. The event may not have met everybody’s expectations, but it was there that the term ‘artificial intelligence’ was coined. The UK followed with the ‘Mechanisation of Thought Processes’ conference in 1958.

It would only be a matter of 30 years or so before the golden era of personal computing began (think Windows 95!) and robots such as Attila were built at the Massachusetts Institute of Technology (MIT). Attila was the first robot I saw at ‘AI: More than Human’, and for me it marked the great leap humans have made from stationary thinking machines to animate digital creatures.

This was when the exhibition took a turn into the world of AI as we’ve come to know it today. In part two, I’ll explain how ‘AI: More than Human’ showed the many possible benefits of AI such as its potential to eradicate illnesses and produce whole new food groups. It also examined its darker side - the inherent prejudices it can hold and its capacity to ultimately govern society.

Until next time. 🤖 


The Rise of Virtual Reality and Augmented Reality in Manufacturing

From the consumer market to manufacturing, virtual reality (VR) and augmented reality (AR) technologies are revolutionising the world today. VR helps manufacturers digitally simulate a product or environment, while AR lets them project digital products and information onto the real-world environment. Businesses are now planning their production and assembly processes out in full in a virtual world, which in turn speeds up factory and plant commissioning and operation.

We are seeing big movement, especially in the high-tech industry sectors, but it will be interesting to see how the technology pans out in small and medium-sized enterprises (SMEs) further down the line. In this blog, we’ll discuss how VR and AR are impacting how we manufacture today.

 

Virtual vs. Augmented Reality

Virtual reality is currently booming in the consumer market and is easily recognised by the big VR headsets that come with it. Once you’ve put on the headset, you are immersed in a new digital environment. VR headsets incorporate both visual and audio simulation.

Augmented reality is a slightly different concept: it overlays a digital interface onto the real world. AR is more commonly associated with the Pokémon Go app, or with IKEA’s feature that lets you view your chosen sofa or wallpaper in your own home, projecting a digital animation onto the real world.

It's clear that industry is embracing VR and AR technologies as a way to display the full capabilities of its systems. Last year, BEUMER Group, a client of our sister agency Napier, used VR and AR technologies on its exhibition stand. The virtual reality experience allowed visitors to immerse themselves fully in a real-life example, demonstrating the capabilities of the system from start to finish.

The team also set up an augmented reality demonstration of BEUMER’s baggage handling systems, showcasing what the future of technology-led airports could look like. Read the full blog about the stand at the airport exhibition here.

 

Design Development

Taking it back to the very beginning, VR and AR are supporting product development. Designs can now be optimised and refined from the very start, allowing teams to review, adjust and quickly modify concepts and ideas before they ever go into production. The tools make it possible to animate and visualise what is being designed, enabling virtual testing and analysis. With better technologies used this early in the process, we can expect better products at the end.

Another benefit of using VR and AR in manufacturing is virtual product simulation for new products in their development phase. Simulations make it easy for anyone to understand the look and feel of an upcoming product, which means not everyone on the team needs a technical background or an understanding of complex 2D and 3D models and drawings - an essential advantage when looking for buy-in during product development.

 

Full Virtual World of Production

We can also look at how virtual reality and augmented reality are affecting production as a whole rather than a single product. VR and AR enable businesses to speed up their operations and plan beyond one product, mapping out the whole production and assembly process in one virtual world. More practically, AR and VR are helping organisations maximise productivity by optimising the positioning of automation lines, production cells, robots and people.

Rehearsing and training staff is a big task and guess what… AR and VR can do it for you! Younger generations increasingly prefer interactive learning, and with the adoption of augmented and virtual reality, these game-like teaching tools are becoming a new trend.


Will self-checkouts ever speak like Michael Jackson?

It’s been over a quarter of a century since Marty McFly stepped into the DeLorean time machine to pay us a visit on October 21st 2015. However, things haven’t gone exactly to plan – or the way that Marty experienced it the first time round.

There really aren’t many people using fax machines anymore – instead the go-to communications technology is of course the mobile phone. There’s also sadly nowhere to eat where a virtual Michael Jackson will take our order for fajitas. In fact, many of us still prefer to be served by a person at the supermarket rather than wrestle with the self-checkouts.

So how would Marty McFly feel about the technological innovations that we have made? As a technical PR company, we’re always researching and marketing the latest gadgets – we’re desperate to know what brilliant inventions the future will bring to our desks. We figured now was the ideal time to find out, and managed to catch up with Marty before he disappeared into an explosion of lasers, fire and smoke.

Ed: Hi Marty McFly, it’s a pleasure to meet you and thank you for coming back to the future to meet us.
Marty: It’s a pleasure to be here. Again.

Ed: Quite. On that note, let’s start by asking, is 2015 how you remembered it?
Marty: No way. It’s way different. Where are all the flying cars and the self-tying shoes? I was looking forward to seeing Jaws 19, so I was gutted to see that it finished at Jaws Revenge – what happened to the rest of them?

Ed: We’ve moved on….
Marty: You call Sharknado progress?

Ed: Ahem. Maybe not. What else is different?
Marty: What happened to all the hoverboards? I at least thought you’d have those by now!

Ed: Well, we’ve got the Swegway…
Marty: Yeah, but you’re not allowed to actually ride them are you? Did you see that story about the cop who got busted for riding one in London, England?

Ed: Very true. We see you’ve discovered hyperlinks though. You completely didn’t see the Internet coming when you first came back to the future did you?
Marty: Yeah, you’ve got me there. I have to say this Internet thing is neat. Much better than the dust-repellent paper the first time I came to 2015.

Ed: You missed out on mobile phones too…
Marty: True, but we did have flying cars, self-tying shoes and hoverboards. Did I mention that?

Ed: You might have. OK, so what DID you actually get right?
Marty: Well, we had something very like your Google Glass devices, with things like built-in cameras and even something very much like your Interweb…

Ed: It’s Internet…
Marty: Yeah, Internet. Plus we also had plasma screen TVs, 3D movies and even video calls, which I think you guys call Skype? As I’ve been here once already, I’ve had a good 25 years to teach my parents how to use it. 

Ed: Useful. So apart from flying cars, self-tying shoes and hoverboards, what one thing are you surprised we don’t have in our version of 2015?
Marty: Well, I think automatic dog walkers were a pretty great idea. And self-drying jackets, which would save you guys a fortune on tumble-drying. There were also remote control litter bins that could empty themselves.

Ed: Aha! Now that we might be able to do. There’s a plan to use the Internet of Things in Milton Keynes so that dustbins can tell the local council when they need emptying.
Marty:  Err yeah, that’s great. Real cutting-edge stuff. Let’s hope you find some other things to do with it too.

Ed: Like what?
Marty: No idea. Hey, how about I go find out and come back and tell you in 25 years.

Ed: Any room in that time machine for a passenger?
Marty: Nope. Sorry. Flux capacitor is taking up all the room. Cheerio. See you on 21st October 2040.

Ed: Give our love to Jennifer and the kids. And say hi to Doc for us. Oh, you’ve gone…

It would seem that Marty wasn’t all that enthralled with the advances we’ve made - or perhaps he was feeling slightly bitter that he didn’t predict the rise of smartphone technology and social media…

If you’d like to experience just how far we’ve come since the decade of Back to the Future’s conception, take a look at how clunky the original Macintosh was in 1984, then look at your iPad. Weighing 16.5 pounds, the original Mac had just 128K of memory and was praised for user-friendly features such as its pull-down menus, mouse and icons.

The first mobile phone also made its debut in 1984, weighed nearly 11 pounds and needed a car to be charged. We like things much lighter in 2015, and with the average smartphone now holding 16GB of memory (that’s 16 thousand million bytes against the original Mac’s 128 thousand - roughly 125,000 times more), we can only begin to imagine what the capabilities of mobile technology will be in another 25 years’ time.

In fact, the closest you can get to experiencing the destabilising effect of time travel is to visit the National Museum of Computing in Milton Keynes. There you can marvel at the size and mass of the first laptops, as well as the ‘WITCH’, the 1951 Harwell Dekatron machine that is now the world’s oldest working digital computer. We highly recommend a visit, but in the meantime we’ll be waiting here patiently for Marty to come back and report from the future.


Roll up, roll up, we have an opening for a sparky new PR exec...

Are you searching for a job in technical PR? Today's your lucky day, as we have a position we need to fill as soon as possible. In fact, we are always on the lookout for talented, bright and enthusiastic individuals, so have a go at the quiz below to find out if you'd fit right in here at Armitage Communications.

In order to take the quiz you will need to either sign up or log in via your Facebook account. You can also log in with the details [email protected], password: Armitage.

If your result is 'technical wizard' or 'technical apprentice', please get in touch: email [email protected] with your CV, your quiz result and 200 words on why tech PR is actually a fascinating, mentally stimulating and up-to-the-minute industry. If you scored 'technical error', we're sorry, but our advice is to go back to the drawing board - tech PR probably just isn't right for you.

But hang on, if you're on our blog then something must have taken your fancy...?


Technology, the frenemy of tomorrow's workforce

Technological change seems to occur at the speed of light. As one technical innovation develops, another is just beginning. It can be extremely difficult to keep up, even for those twenty-something millennials who seem to have the advantage over Generation X.

Yet as more and more jobs are absorbed by technology, young people are struggling to find employment. And if this is the case for Generation Y, what will the job market be like for Generation Z?

In a recent report by the World Economic Forum, persistent jobless growth was rated as the second-highest global concern. Larry Summers, former US Treasury Secretary, placed responsibility on the education sector to “meet the needs of this age.” He warned that if current trends continue, whole sections of society will find their standards of living going backwards.

Of course, it’s important to remember that doom and gloom shifts newspapers. While it’s inevitable that jobs will be lost to technology, this is not to say that new opportunities are not already cropping up in their place.

From manufacturing through to media, developments in technology are opening up a raft of new opportunities. Even the PR and marketing industry has seen a seismic change in job roles to keep pace with the exciting possibilities around social media.

Getting ready for work

The best safeguard against being replaced by technology is knowing how to use it. If we want today’s students to enjoy a brighter future, we need to make sure that they go into the workplace fully able to use technology to maximum effect.

Why not start implementing curriculums that prepare students for specific roles, such as digital marketer, social media manager and software engineer? Why not include blogging in English lessons? What about robotic engineering in Design and Technology? Or getting schools to start trading with one another? The possibilities for digitising the workforce of tomorrow are endless.

The UK’s Year of Code is one of many signs that reform has already started. The scheme includes a new initiative to train teachers in software coding, and it’s hoped it will encourage these new skills in the classroom and, further down the line, technology entrepreneurship. In fact, the government has made coding a compulsory part of the curriculum for every child aged 5 to 16.

There’s also a growing network of University Technical Colleges (UTCs), government-funded schools that teach students technical and scientific subjects, educating the inventors, engineers, scientists and technicians of tomorrow. Perhaps more schools should take a leaf out of their book.

In a study by Deloitte, 84% of London businesses said the skill set of their employees will need to adapt over the next decade. Expertise such as ‘digital know-how’, ‘management’ and ‘creativity’ was most in demand. Indeed, with PR ranked 634th on the list of careers most likely to be overtaken by technology, we in the PR industry like to tell ourselves that we’ll be completely fine, at least for the foreseeable future.

The truth is that whether we’re young or old, the demands of today’s workplace mean we all need to keep up-to-date on how to make best use of the technological advancements of the 21st century.

What are your thoughts about technology and the job market? Are we prepared? What can we do to give children the best hope of a successful career in the future?