I wholeheartedly agree with the LinkedIn article’s author, Bruce Anderson, who began by noting, “Many of the jobs that today’s students will be doing in 2030 haven’t been invented yet.”
All we need to do is look back five to 15 years to realize that many jobs we have today, such as specialist roles in social media, search engine optimization (SEO), cybersecurity, blockchain, electric vehicles, and drones, didn't exist or were extremely rare. And it's not just in technology and product development: while the role of chief diversity officer existed in 2014, just eight years ago, how many organizations actually had one?
Some of the job titles Anderson proposes are intriguing and do seem likely to be more common by 2030:
- Organ Creator (scientists who create bodily organs from stem cells and other materials)
- Trash Engineer (specialists in upcycling manufacturing by-products to reduce waste)
- Autonomous Car Mechanic and Drone Traffic Optimizer (both exactly what they sound like)
Others are harder to predict because they depend on the uptake speed of key consumer-facing technologies (e.g., augmented-reality journey builder, metaverse planner, and digital currency advisor).
A 2021 article from the World Economic Forum (WEF) included two jobs that are already significant today, only one year later: work-from-home facilitator and workplace environment architect. Our biweekly i4cp call series, Getting Hybrid Work Right, has made clear just how seriously many organizations are taking flexible, hybrid, and remote work models, and the importance of intentional workplace design. Many already have people in roles leading policy and culture in these areas.
Similarly, the list in Anderson's LinkedIn article includes two people-related jobs that I think organizations should hire for much sooner than 2030, perhaps even now: algorithm bias auditor and human-machine teaming manager.
Algorithm bias auditor
Algorithms and artificial intelligence (AI) are seemingly everywhere in our lives today. This includes business processes such as learning content recommendations and job-matching platforms, whether for external candidates in an applicant tracking system (ATS) or for internal candidates in internal talent and opportunity marketplaces.
The use of algorithms and AI can sometimes reduce bias in a business process, by lessening the role of subjective human decision making at some points. But since these technologies are themselves created and tweaked by human software engineers, they can have biases built into them—ones that are sometimes far less obvious than when a human makes a biased decision in the moment.
Savvy organizations won't wait until 2030 to have someone responsible for bias-auditing all algorithms, machine learning, artificial intelligence, and similar systems used in the organization, from hiring to predictive people analytics to learning and training recommendations, and anywhere else that a bias in one direction or another could have a negative impact.
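To make the idea concrete, here is a minimal sketch of one kind of check an algorithm bias auditor might run: comparing selection rates across demographic groups using the "four-fifths rule" commonly cited in adverse-impact analysis. The function names, data, and 0.8 threshold are illustrative assumptions, not a description of i4cp's checklists or any specific audit methodology.

```python
# Sketch of a simple adverse-impact check an algorithm bias auditor
# might run on a model's screening decisions. Illustrative only.

def selection_rates(outcomes):
    """outcomes: dict mapping group name -> (selected, total)."""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` times
    the highest group's rate (the adverse-impact ratio)."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: round(r / top, 2)
            for g, r in rates.items() if r / top < threshold}

# Hypothetical screening results from a resume-matching model
results = {"group_a": (50, 100), "group_b": (30, 100)}
print(four_fifths_check(results))  # group_b's ratio of 0.6 gets flagged
```

A real audit would go far beyond a single ratio (examining training data, proxy variables, and outcomes over time), but even a simple check like this can surface disparities that are invisible when decisions are reviewed one at a time.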
At i4cp we’ve long recognized the need to conduct bias audits across functions and activities. So, we created a series of detailed bias audit checklists for i4cp members that cover talent acquisition, learning and development, performance management, total rewards, succession planning, and even hybrid and flexible work. This series also includes a Bias Audit Checklist for AI and Algorithms, a great resource for organizations whether they have someone in the role of algorithm bias auditor yet or not.
Human-machine teaming manager
The issue of bias in algorithms and AI was raised in i4cp's 2019 study on advanced work automation. An aspect of that study was captured in the title of the report: Automating Work: The Human/AI Intersection. Sometimes new technologies, automation and artificial intelligence included, are used to replace human labor to reduce costs or increase quality or consistency. But more often we see an intersection of humans and their skills with machine technologies, especially digital ones. Employees' capabilities are more often augmented by the smart application of machine learning, artificial intelligence, robotic process automation, and so on.
But there are many steps involved in doing this effectively, and these can vary across roles and technologies. High-performance organizations more often take a holistic approach: identifying where AI and automation can be helpful, piloting their use, and then, if successful, rolling out the changes at scale in a way that is understandable and positive for all involved. Doing so enables the organization to reap the benefits of the new technology without unintended consequences along the way, such as employees fearing they might lose their jobs rather than becoming more efficient and valuable to the organization (which is far more often the outcome).
At i4cp, we’ve woven these considerations into our Seven Steps of Workforce Planning process. But as the LinkedIn and WEF articles noted, larger organizations can benefit from having someone in the role of human-machine teaming manager, to really drive the benefits of augmenting employees’ capabilities as deeply and widely as possible with the latest technologies, and doing so in a consistent way across the organization.
As Anderson noted, the best candidates for such a role will have a background in experimental psychology or neuroscience paired with work in computer science, engineering, or HR. Changes are happening so quickly that I'd be looking for people with this combination of skills now, not by 2030.
For more insights on hybrid work, join our biweekly call series Getting Hybrid Work Right, hosted by me and i4cp CEO Kevin Oakes.
Tom Stone is a Senior Research Analyst at i4cp.