Forcing teams to use AI in the workplace can dampen their pre-existing enthusiasm for the technology

Teams with a pre-existing positive inclination towards AI saw their enthusiasm for collaborating with the technology dampened when they were forced to use it, a new study has found.

Researchers also found that teams with initially negative views of AI were in fact more likely to collaborate with it when forced to use it. The research, published in the academic journal Group & Organization Management, underscores the importance of trust in AI and has implications for how employers should introduce the growing technology into their businesses.

IÉSEG School of Management, Europe’s leading international business school, has released a study examining the factors influencing the adoption of artificial intelligence (AI) within teams. IÉSEG Professor Evangeline Yang, in collaboration with researchers from the University at Buffalo, Rutgers University, and Simon Fraser University, examined teams’ attitudes towards AI and its implementation within organisations.

Titled ‘Why Teams Adopt AI or Not: Insights into the Use of AI in Teams,’ the study offers an understanding of how teams adopt artificial intelligence and how it is introduced into their operations.

Researchers looked at how individuals and teams interact with AI by conducting an extensive review of top-tier journals, exploring attitudes towards AI and other new technologies and the strategies used to implement them. They found that when AI adoption was voluntary rather than forced, teams with positive perceptions of AI demonstrated an increased inclination to use it, while teams with negative perceptions were even less likely to adopt the technology.

Professor Yang’s research highlights the importance of assessing employees’ attitudes towards AI before deciding how to adopt it within an organisation, and whether adoption of the technology should be mandated or left voluntary.

She believes organisations should gauge those attitudes, for example through a survey or questionnaire, before taking a decision on how to adopt the technology.

Professor Yang said: “As an HR researcher, I was particularly interested in the human side and the way individuals and teams interact with AI.

“The study highlights the importance of feelings and trust in AI when implementation, and whether to adopt it, is left to the team.”

The study offers practical implications for organisations looking to leverage AI effectively and optimise the performance of their teams.

More information is available in the full paper.

