
Working the Future

Our future of work blog 

02/04/2019, 13:19



MANNERS COST NOTHING





‘Ghosting’ and ‘airing’ are now seemingly mainstream behaviours, both terms firmly embedded in 21st-century vernacular. According to the New York Times, ‘orbiting’ is the new kid on the block. (For the more curious among you, this 2016 article in The Independent opened my eyes to a whole new lexicon. Who knew?) 
  
Our personal relationships have been transformed by smartphones and social media, giving rise to a new and apparently commonly accepted set of behaviours that are nonetheless distressing for those on the receiving end. A 2015 article in Psychology Today revealed that approximately 50% of men and women have experienced ‘ghosting’, and as many have ‘ghosted’ someone else. Wow. 

Now, it seems, these behaviours are starting to permeate the workplace. 

What does this say about 21st-century society? For us, it suggests that the rise of smartphones and digital culture has created virtual landscapes where it’s much easier to hide behind screens and avoid ‘hard’ conversations. "It’s not you, it’s me" was always difficult to say, and yet the act of providing closure is so much kinder than simply evaporating into the virtual ether. 

Our digital landscapes are rapidly changing how we interact with one another as humans. We’ve profiled the work of Sherry Turkle previously; Sherry is a professor at MIT whose specialist area is the psychology of human relationships with technology. Her 2015 book Reclaiming Conversation argued that digital natives - those who’ve grown up with the internet and the smartphone - are relinquishing face-to-face communication in favour of online interaction. Face-to-face communication is quite simply too hard, requiring emotional resilience to deal with the spontaneity of human dialogue and to manage unedited self-presentation: there is no opportunity to edit or photoshop a real-time conversation.

Digital communication allows us the (unacceptable) excuse of avoiding any human interaction beyond the baseline transactional, but faced with the proliferation of workplace automation and AI, this habit has never served us less.

I was recently invited to a meeting with a new client to discuss a new project. The first meeting was postponed with less than two and a half hours’ notice. It was rearranged, only for the client to text, ten minutes after the allotted meeting time, saying that he was unwell. True story. While I’m sure there’s some plausible explanation, it felt incredibly unprofessional and tarnished our view of that organisation. The actions of one person can have a resounding impact.  

A recent Washington Post article reported workers ghosting employers: quite simply, they stop turning up to work and fail even to message in a resignation. Other research points to a growing trend among Generation Z of simply disengaging from hiring processes when they feel they’re not being engaged with in the right way. Whichever way we skin it, the fact that these behaviours are now being documented as more than one-off instances suggests our basic human capability to communicate has been massively impacted. 

********************** 

Our human skills, including social awareness, empathy and communication, are key to remaining relevant in the future of work. Tech firms are already competing to develop algorithms that will reduce - or worse, eliminate - the need for humans in the workplace. Many tech evangelists argue that we’re a hair’s breadth away from technology being able to emulate human empathy and reasoning; while we strongly disagree with that viewpoint, it means work is already underway to render humans irrelevant in the workplace of the future. 

Human skills include the ability to show up and engage in hard conversations - conversations that will inevitably challenge us with their emotional complexity. It takes huge courage to engage in communication that may cause offence if handled the wrong way. It seems now, however, that our ability to have any kind of hard conversation is being undermined by an inability to demonstrate even bare-bones levels of human courtesy and respect. 

But we CAN and MUST communicate with empathy if we are to remain relevant in the future of work. When we allow ourselves to be vulnerable in our social interactions, such conversations are not only possible but hugely rewarding emotionally. Brené Brown, whose work on vulnerability and courage has gained global acclaim, devotes her latest book, Dare to Lead, to the idea that the ability to have hard conversations will determine leadership success or failure in the future of work.   

Human relationships have always been messy. They can be tough and immensely painful, especially when they don’t work out. But the flip side is that they provide deep joy and a sense of connection that is absolutely critical to our wellbeing. Attempting to "swipe left" on human interaction does us no service at all, either at work or in our future ability to thrive as a species.  

For thousands of years, the way that humans trade with one another has been contingent on trust and robust human relationships. Workplace automation and AI will provide transactional efficiency; the huge commercial opportunity for business now is to leverage human skills to provide deeply enriching client experiences that embed loyalty in increasingly fluid landscapes. We predict that professional empathy is about to go large.  
05/03/2019, 12:01



SURVIVING AND THRIVING IN THE ATTENTION ECONOMY: THE CASE FOR ‘ATTENTION HYGIENE’





The first time Pat and I started thinking about ’attention hygiene’ was in March 2018. We were discussing our learning goals for the year and Pat mentioned a frustration that he was finding it increasingly hard to focus; he felt perpetually distracted. 

This kickstarted a conversation about mindfulness practice and its benefits for enhanced creative thinking, something we know will be an essential human skill in the future of work. We talked about how to create ’better’ work habits, and then promptly changed the subject. 

Some months later, I started reading The Shallows by Nicholas Carr. I was stunned by his observation in the opening pages of the book: 

"I’m not thinking the way I used to think... I used to find it easy to immerse myself in a book or a lengthy article... That’s rarely the case anymore. Now my concentration starts to drift after a page or two. I get fidgety, lose my thread, begin looking for something else to do.  I feel like I’m always dragging my wayward brain back to the text..." 

I couldn’t believe it; it was as if Carr had been listening in on our earlier conversation. The book was illuminating, and it helped me understand that the feeling of constant distraction wasn’t a failing of the middle-aged mind, but a by-product of the internet, which has pervaded our human environment to become a permanent fixture through our ubiquitous ‘always-on’ devices. I was keen to find out more.

+++++++++   

We’re at a fascinating crossroads in human history. In 29 short years, the internet has gone from the world-wide-web concept imagined by Sir Tim Berners-Lee to the single most-used tool for navigating our daily lives. According to a Daily Telegraph article published in August 2018, the average UK person spends 24 hours online per week, with one in five adults spending more than 40 hours a week surfing the internet. Such vast access is undoubtedly enabled by the proliferation of smart devices and high-speed broadband, but it’s staggering to think that we are, as a population, typically spending more than half of a standard working week scrolling through newsfeeds, social media and whatever else takes our fancy. 

There’s a reason, however, why we’re spending so much time online. The design methodology behind the most popular online platforms - Facebook, Twitter, Instagram, Snapchat et al. - is starting to be made public by people who were involved in the early years of development and are now questioning the moral integrity of the code they originally wrote. It’s an alarming story. 

Lured by the potential of digital advertising revenues, algorithm designers have spent the last 15 years building an experience specifically intended to have us spend as much of our time online as possible. The more time we spend online, the easier it becomes for technology platforms to learn our habits and behaviours. The more that’s known about the way we behave, the easier it becomes to micro-target us with adverts tailored towards the very things our web-surfing behaviour suggests we will like. More recently, this has taken an ominous twist with ongoing revelations about Facebook, in particular, being used as a vehicle to influence our core value systems and political beliefs. We’re learning, in real time, how this played out both in the UK’s Brexit referendum and in the 2016 US presidential election. The voting systems of other countries have also been compromised, though as yet with less alarming outcomes. 

As James Williams has written in his 2018 book Stand Out of Our Light:

"There are literally billions of dollars being spent to figure out how to get you to look at one thing over another; to buy one thing over another; to care about one thing over another. This is literally the design purpose of many of the technologies you trust to guide your life every day."

The size and scope of the attention economy cannot be overstated; it is literally jaw-dropping. Online advertising is already one of the largest segments of the global economy, and forecasters anticipate huge further growth into the early-to-mid 2020s. One January 2019 forecast projects the online advertising market to grow at a compound annual growth rate of approximately 33% to 2023, by which point it is expected to be worth 21.6 billion US dollars. 
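To put that growth rate in perspective, here is a quick back-of-envelope sketch in Python. The 33% CAGR and the 21.6-billion-dollar 2023 figure come from the forecast cited above; the implied 2019 base value is our own back-calculation, purely for illustration:

# Back-of-envelope check on the cited forecast: a ~33% compound annual
# growth rate (CAGR) arriving at 21.6bn USD in 2023. The implied 2019
# base value is derived here for illustration only.
CAGR = 0.33
VALUE_2023 = 21.6  # billion USD, per the cited forecast

# Four compounding years separate 2019 from 2023.
value_2019 = VALUE_2023 / (1 + CAGR) ** 4
print(f"Implied 2019 market value: ~{value_2019:.1f}bn USD")

# Year-by-year trajectory under constant 33% growth.
for year in range(2019, 2024):
    value = value_2019 * (1 + CAGR) ** (year - 2019)
    print(f"{year}: ~{value:.1f}bn USD")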

Consider another data point from Williams who, as a former software design engineer at Google, is, we reason, better placed than many to comment: 

"Each day, the Android mobile operating system alone sends over eleven billion notifications to more than one billion users." 

Marry this with a data point from Adam Alter’s 2017 book Irresistible: 

"70 percent of office emails are read within six seconds of arriving...This is hugely disruptive: by one estimate, it takes up to twenty-five minutes to become re-immersed in an interrupted task." 

It turns out that our human brains are hardwired for distraction. A 2007 scientific report revealed that our brains respond faster to distraction than they do to the sustained effort of paying attention. This is the basis upon which the various online platforms that consume so much of our time and attention develop new products, services and features. 

I don’t know about you, but just becoming aware of this premise made me feel a) vindicated, insofar as I finally understood why I was perpetually struggling to see tasks through to fruition, and b) enraged; bluntly, I felt as if I’d been hoodwinked, and as if my own personal value system had somehow been compromised. 

+++++++++ 

It strikes us as (worryingly) paradoxical that, at the precise moment in human history when we need our wits about us most - when the essential skills of the 21st century will be embedded in an ability to innovate and think critically - most of us are sleepwalking through life, aware at some visceral level that something isn’t quite right, yet unable to put our finger on it. If we wanted to indulge in conspiracy theory, we might argue that the very technocrats who are manipulating our attention and focus for financial gain are the same elite few now telling us that our future lives will be work-less, as their own technology automates all key aspects of our commercial environments.

Creating a better and more human-centric future of work will involve significant reflection, critical thinking and an abundance of focus; designing and implementing fluid workforces that successfully leverage competitive advantage will require considerable self-discipline, particularly in the area of ‘attention hygiene’. We’ll need this both for ourselves, as change-makers, and for those whose lives we are working to improve. 

This isn’t an easy path. When I deleted Facebook last year, I was surprised to experience, alongside a sense of relief (that I was finally free of the increasingly narrow algorithmic echo-chambers), a parallel sense of bewilderment. Frankly, it took at least a fortnight for my brain to adapt, and in the meantime, I felt as if I was missing something intrinsic - such was the extent to which mindless scrolling through endless feeds of irrelevant status updates had become ingrained in my life. 

I recently read Tools of Titans by Tim Ferriss. The book is a collection of the habits and routines of some of the most successful people in the world, across sport, business, finance and other domains. What I found most striking was the extent to which so many high-performing individuals have made meditative practice part of their daily routine. It seems that while mindfulness and meditation are widely hailed for the therapeutic calm they bring to the busyness of 21st-century living, there’s a deeper benefit that is certainly more relevant to this discussion. 

In his 2013 bestseller Focus, Daniel Goleman writes: 

"It takes meta-cognition - in this case, awareness of our lack of awareness - to bring to light what the group has buried in a grave of indifference or suppression. Clarity begins with realising what we do not notice - and don’t notice that we don’t notice."

+++++++++ 

Preparing for, and adapting to, a very different future of work will take an abundance of focus, reflective thought and critical analysis. We cannot possibly adapt to a new way of doing and being in the world when we are only half-focused on what’s at stake. 

The role of Working the Future, and indeed, of others like us who recognise the opportunity for designing a more inclusive and balanced work future, is to raise awareness and help others navigate their way through complex change. We can’t do this with our eyes half closed. 

"Being in survival mode narrows our focus," writes Goleman in Focus. With so many of our societal norms currently in freefall, the common human response is that of survival mode, focusing only on what’s important right now.  Those of us who want to make a difference and create a better future of work, must keep paying attention to paying attention, and in spite of their apparent size and scope, not allow the digital giants to get the better of us. There is important work to be done.       

References 
1. Carr, N. (2010). The Shallows: How the Internet is Changing the Way We Think, Read and Remember. London: Atlantic Books.
2. Williams, J. (2018). Stand Out of Our Light: Freedom and Resistance in the Attention Economy. Cambridge: Cambridge University Press.
3. Alter, A. (2017). Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked. New York: Penguin Random House.
4. Goleman, D. (2013). Focus: The Hidden Driver of Excellence. London: Bloomsbury.
5. Ferriss, T. (2016). Tools of Titans: The Tactics, Routines and Habits of Billionaires, Icons and World-Class Performers. London: Penguin Random House.
6. Global online advertising market value and forecast growth statistics. reportsherald.com, January 2019.
31/01/2019, 10:43



THE FUTURE CIO





Today I offloaded some CDs at the charity shop. I was asked what condition they were in. Since I haven’t owned a CD player in over five years, I said that I thought they were good, but that they’d need to check for themselves. This led to a conversation about the pace of technology today. It’s mad that streaming has rendered the CD and DVD player obsolete. Perhaps I’m just old now. 

We’re in the midst of a game-changing transition. Technology has, in recent years, transformed how we behave both at work and at home, and Moore’s Law suggests that the pace of technological progress will only ever speed up from here on in.  

We know that commercial landscapes are set to become infinitely more technology-centric - digital transformation promises untold efficiency to improve both profitability and customer experience. 

Given the pace and scope of technology innovation, however, how does a CIO remain on top of an accelerating digital landscape? 

As little as five years ago, a CIO’s CV would demonstrate the skills and experience gained in a highly bespoke commercial environment, with much of the IT infrastructure stack designed, built and managed in-house. Today, from an ROI perspective, low-cost pay-as-you-go (PAYG) technology options challenge the business case for building in-house. It’s far more commercially appealing to take advantage of the myriad cloud-based business offerings. 

Furthermore, the range of technology available across every aspect of both commercial and consumer life has changed the very relationship that humans have with tech. Its pervasive nature makes most users far more willing to experiment with apps and readily available web-based tools that improve the way we do things. More than ever, it’s now the height of cool to be a "geek" and an early adopter of emerging technology. It used to be trainers; now it’s gadgets. 

How we engage with technology in our personal lives transforms our expectations of it in the workplace. Younger cohorts expect technology to be instantly accessible, and to be able to use the applications they want, whenever and from wherever. BYOD is old-school; we’re now in the age of BYOA - "Bring Your Own Anything". These younger cohorts will move jobs if they feel career progression is hindered by a limiting technology environment.

So, what can an IT leader do? He or she is faced with countless technology options, a boardroom mandate to crack on and use technology to create commercial advantage, faster than the competition, at reduced cost, and of course while maintaining a robust and secure 24/7/365 uptime environment. No pressure. Total exposure. 

We can ponder the lexicon of this brave new landscape, and perhaps consider hiring a Chief Digital Officer alongside or instead of the traditional CIO. But this is just tinkering. 

Limitless choice renders it impossible to retain expert status across the technology piste. As Kevin Kelly wrote in The Inevitable: 

"Endless Newbie is the new default for everyone, no matter your age or your experience." 

To remain ahead of the game in technology leadership, the royal jelly of future success will be constantly figuring out how to maintain a stable yet constantly evolving technology environment in which all data is safe and secure. It will mean constantly evaluating where to go next, what to allow and what to dismiss, all while justifying your arguments to an audience that a) wants immediate access to quick fixes and b) doesn’t always care about the business imperatives of safety, security and governance. 

This requires a new set of human skills. It requires pragmatism, dexterity, advocacy and above all, the tenacity to hold steady in the wake of constant disruption. Your new world landscape will be like constantly trying to complete a 5000-piece jigsaw when you can’t even find the edges.  

The pre-eminent skill for the future CIO will be the humility to recognise that it’s no longer humanly possible to keep on top of all the emergent technology options in micro-detail. That would be like trying to drink from a fire hydrant.

Equally, trying to inhibit the creeping uptake of the latest business applications within a commercial environment is a waste of time, and will arguably be perceived as an outdated attempt at power and control. 

Far easier would be to assume a role of advocacy and education - asking colleagues to consider the following questions before being lured by the next time-saving or experience-improving application (a sketch of how this might be operationalised follows the list). 

1)     Is the application secure? 
2)     Will it compromise the business in any way? 
3)     Does it pass on any proprietary data to third parties (particularly pertinent in the case of low-cost or no-cost applications)? 
4)     What is its likely shelf-life? 
5)     How easy would it be to migrate away from the application if it becomes unworkable for any reason? 
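As a minimal sketch, assuming a team wanted to record the answers consistently, the five questions could be captured as a simple vetting record in Python. The field names, the two-year shelf-life threshold and the example application are our own illustration, not a prescribed standard:

# A minimal sketch of the five-question checklist as a vetting record.
# Field names, thresholds and the example app are illustrative
# assumptions, not a prescribed standard.
from dataclasses import dataclass

@dataclass
class AppAssessment:
    name: str
    is_secure: bool                  # 1) Is the application secure?
    compromises_business: bool       # 2) Does it compromise the business?
    shares_proprietary_data: bool    # 3) Proprietary data to third parties?
    shelf_life_years: float          # 4) Likely shelf-life
    easy_to_migrate_away: bool       # 5) Ease of migrating away

    def recommended(self) -> bool:
        """Simple pass/fail gate over the five questions."""
        return (self.is_secure
                and not self.compromises_business
                and not self.shares_proprietary_data
                and self.shelf_life_years >= 2.0   # assumed threshold
                and self.easy_to_migrate_away)

# Hypothetical example of vetting a new low-cost tool:
tool = AppAssessment("ExampleNotesApp", True, False, False, 3.0, True)
print(f"{tool.name}: {'adopt' if tool.recommended() else 'review further'}")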

Having users ask these questions creates an environment where people think more holistically (and sensibly) about new technologies, and it demonstrates a leadership capability that transcends the 20th-century paradigm of hierarchy, power and control. 

The future of work is fluid; technology will irrevocably change the nature of organisational structures and of work itself. Leaders who have the self-awareness to adopt a collaborative approach, working with all stakeholders in true partnership to embrace continuous learning, are, to our mind, far more likely to be successful than those who maintain a hierarchical (and perceivably authoritarian) stance in an effort to preserve status and control. 

References: 
Kelly, K. (2016). The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future. New York: Penguin Random House.

Inspirations: 
Conversations with several forward-thinking CIOs across the past few months have informed this blog. Guys - you know who you are. Thank you.

