DO WE NEED EXPLAINABLE AI IN COMPANIES?
INVESTIGATION OF CHALLENGES, EXPECTATIONS, AND
CHANCES FROM EMPLOYEES’ PERSPECTIVE
Katharina Weitz
Chair for Human-Centered AI
University of Augsburg
Universitätsstraße 6a
86159 Augsburg
katharina.weitz@uni-a.de
Chi Tai Dang
Chair for Human-Centered AI
University of Augsburg
Universitätsstraße 6a
86159 Augsburg
chitai.dang@uni-a.de
Elisabeth André
Chair for Human-Centered AI
University of Augsburg
Universitätsstraße 6a
86159 Augsburg
elisabeth.andre@uni-a.de
ABSTRACT
Companies’ adoption of artificial intelligence (AI) is increasingly becoming an essential element of
business success. However, using AI poses new requirements for companies and their employees,
including transparency and comprehensibility of AI systems. The field of Explainable AI (XAI)
aims to address these issues. Yet, the current research primarily consists of laboratory studies,
and there is a need to improve the applicability of the findings to real-world situations. Therefore,
this project report paper provides insights into employees’ needs and attitudes towards (X)AI. For
this, we investigate employees’ perspectives on (X)AI. Our findings suggest that AI and XAI are
well-known terms that employees perceive as important. This recognition is a critical first step
for XAI to potentially drive successful usage of AI by providing comprehensible insights into AI
technologies. In a lessons-learned section, we discuss the open questions identified and suggest future
research directions to develop human-centered XAI designs for companies. By providing insights
into employees’ needs and attitudes towards (X)AI, our project report contributes to the development
of XAI solutions that meet the requirements of companies and their employees, ultimately driving the
successful adoption of AI technologies in the business context.
Keywords Explainable AI · Human-Centered AI · User Evaluation
1 Introduction
AI applications have already impacted our private and working lives, for example, through voice and face recognition in smartphones,
chatbots, or cobots in industry. National and international companies are aware of the impact of AI technology on their
success and innovation potential. For example, European companies expect a sales increase with the help of AI of about
5 million US dollars by 2025 [1]. At the same time, legal regulations increasingly demand that these AI
technologies be comprehensible and transparent [2]. However, these requirements are not inherent, for example,
in Deep Neural Networks (DNNs), often referred to as black-box models.
The research area of Explainable AI (XAI) has dedicated itself to closing this gap and providing comprehensible
explanations of black-box AI systems. Gunning et al. [3] see the motivation of XAI in providing comprehensible
explanations to humans. Accordingly, DNNs become explainable when their inner workings or decisions are described so
that humans can understand them [4]. Beyond this, researchers increasingly argue that one explanation does not fit all
and demand that XAI be personalized, depending on the stakeholder and the application scenario [5, 6]. The
prevailing opinion is that XAI has different relevance for different stakeholders [7]. Therefore, Schneider and Handali
[7] highlight the importance of collecting and investigating data and information about explainees (e.g., machine learning
knowledge, preferences, prior experiences).
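To make the idea of explaining a black-box model more concrete, the following sketch (not from the paper; a minimal illustration using a hypothetical toy network) computes a simple gradient-based saliency score, one common post-hoc XAI technique, which marks the input features that most influence a model's output:

```python
import numpy as np

# Hypothetical "black-box" model: a fixed two-layer network with random weights.
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(3,))

def model(x):
    """Scalar score produced by the toy network."""
    return float(np.tanh(x @ W1) @ W2)

def saliency(x, eps=1e-4):
    """Central finite-difference gradient of the score w.r.t. each input feature.

    The absolute gradient magnitude is read as feature importance: features
    whose small perturbation changes the score most get the highest score.
    """
    grads = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        grads[i] = (model(x + d) - model(x - d)) / (2 * eps)
    return np.abs(grads)

x = np.array([0.5, -1.0, 2.0, 0.1])
print(saliency(x))  # larger values mark more influential input features
```

This is only a sketch of the generic technique; real XAI toolkits (e.g., saliency maps for image classifiers) use analytic backpropagated gradients instead of finite differences, but the interpretation of the result is the same.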
The overview of Langer et al. [8] shows that most scientific work in the field of XAI is done without empirical
investigation of stakeholders. Langer et al. [8] highlight the importance of investigating stakeholders’ desiderata to
arXiv:2210.03527v2 [cs.HC] 2 Jun 2023
Figure 1: Overview of the content of this paper. We describe the results of an online survey conducted in companies
that asks employees about the company context and their perspective on (X)AI. Based on the results, open questions are
discussed and future research directions to incorporate employees’ requirements in XAI design are suggested.
develop and adjust XAI approaches. However, they also point out that little research currently identifies, defines,
and empirically examines stakeholder desiderata. Instead, the small number of conducted experiments investigates
the impact of explanations on users in different domains such as healthcare (e.g., [9]), education (e.g., [10]), and
production work (e.g., [11]). This research primarily focuses on laboratory experiments and gives first impressions
and understandings about the impact of XAI on users, referred to as human-grounded evaluation [12]. The insights of
these lab studies show that users’ attributes and the AI application’s characteristics must be considered when designing
XAI [13]. Nevertheless, studies need to pay more attention to the demands of real-world applications. Kraus et al.
[14] investigated the impact of XAI in economically relevant ecosystems (e.g., the healthcare, financial, manufacturing,
and construction sectors). For this, they conducted an online survey and interviews with relevant stakeholders. The
focus of their work lies primarily on the helpfulness of different XAI tools for application-specific
use cases. However, stakeholders’ needs and attributes must also be considered when designing XAI [15]. Company
employees’ needs and attitudes regarding (X)AI have yet to be clarified. Our study approaches this issue by asking employees
about their attitudes towards (X)AI¹ and their impressions of AI deployment in their company. Therefore, our paper
investigates employees’ perspective on (X)AI by using an online survey: (1) to understand the context of their work
with AI, we investigated the employees’ perspective on AI used in their company, including an overview of the current
AI applications in use, as well as (2) the personal perception of employees towards AI and XAI (see Figure 1).
Based on the results, we discuss open questions in the lessons learned section. More concretely, this paper contributes
the following:
• It provides employees’ views about their companies’ usage of AI technology and their perception of XAI.
• It gives an overview of the challenges and risks of AI technology that employees perceive and provides impressions of users’ needs.
• Finally, it concludes with lessons learned, where we discuss the identified open questions that serve as a basis for investigating requirements for human-centered XAI designs in companies and industries.
¹ (X)AI refers to AI and XAI.
2 Online Survey
2.1 Procedure
Our goal was to identify the present state of (X)AI-related issues and potential in companies from an employee’s
perspective. To achieve this, employees of companies of different sizes and sectors were asked about AI technology’s
current and future development in their company through an online survey. We distributed the questionnaire through
multipliers of the Plattform Lernende Systeme/acatech² (e.g., chambers, competence centers, corporate leaders) to
cover a broad portfolio of companies and their employees. The questionnaire was composed in German and aimed at
employees of German-based companies.
2.2 Research Questions
We asked each employee about the current status and the strategic planning of using AI systems in their company. For
this, we formulated the questions of the survey to address: (1) a broader view on AI in the company context and (2) the
perspective of employees about (X)AI³. Since we want to investigate employees’ perspectives as they interact with a
(future) AI system in the company, it is important to note that the survey reflects the employees’ subjective perception,
not the company’s official position.
Company Context The company perspective may generally serve companies that do not yet use, barely use, or already use
AI technologies in their further strategic orientation and planning. For example, what are companies’ motivations, usage areas,
or issues regarding AI? Here, the experiences and decisions gained from the current state help assess the individual
potential of introducing or using (X)AI technologies. To understand the company and working context, we investigated
employees’ views of AI in their company, including a look at the existing AI applications and those planned for
the future. For this, we formulated the following research questions:
RQ-C1: What motivations and risks for their company do employees see in using AI technologies?
RQ-C2: Do companies already use AI technology, and if so, which applications exist in companies?
RQ-C3: What are companies’ plans regarding using AI technologies?
Employees’ (X)AI Perspective Insights into employees’ general attitude, knowledge, and acceptance of (X)AI technologies,
including demographics, show the state of practical implementation in companies.
These insights guide the requirements for implementing (X)AI technologies. To investigate the personal perception of
employees regarding (X)AI and their experiences with AI technologies in their companies, we formulated the following
research questions:
RQ-E1: How do employees rate their (X)AI knowledge and attitudes towards (X)AI?
RQ-E2: How do employees rate the AI technologies used in their company?
RQ-E3: Is there a correlation between personal AI knowledge/attitude and the rating of AI technologies in the
company?
RQ-E4: How does the perception of the AI technology used in their company differ depending on demographic
data (e.g., age, educational attainment, company position)?
RQ-E5: How does the knowledge and attitude towards XAI relate to demographic data?
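RQ-E3 asks about a correlation between ordinal survey ratings; Likert-style data of this kind is typically analyzed with a rank correlation. As a hedged illustration (hypothetical response data, and a simplified Spearman computation that skips the tie-correction real statistics packages apply), the analysis could be sketched as:

```python
import numpy as np

# Hypothetical 1-5 Likert responses from ten employees (not the study's data):
ai_knowledge = np.array([2, 4, 3, 5, 1, 4, 2, 3, 5, 1])
company_ai_rating = np.array([3, 4, 4, 5, 2, 5, 2, 3, 4, 1])

def spearman(a, b):
    """Spearman rank correlation as the Pearson correlation of ordinal ranks.

    Simplification: ties get ordinal ranks instead of average ranks, which is
    fine for a sketch but slightly off from scipy.stats.spearmanr with ties.
    """
    ra = a.argsort().argsort().astype(float)  # rank of each response
    rb = b.argsort().argsort().astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb)))

print(round(spearman(ai_knowledge, company_ai_rating), 2))
```

In practice one would use `scipy.stats.spearmanr`, which also returns a p-value and handles tied ranks correctly; the sketch only shows what the statistic measures.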
2.3 Methodology
We designed a survey with groups of questions, each group addressing one of our formulated research questions.
Demographic Data We collected information about participants’ age, gender, educational background,
knowledge, and role in the company.
Company Information To get an overview of the size and domain of the company, we asked questions about the
sector and in which area (i.e., production or office work) the participants work. Here we used a combination of
predefined answers and free-form answers.
² The National Academy of Science and Engineering (acatech) provides advice on strategic engineering and technology policy issues to policymakers and the public.
³ Abbreviations in the research questions: RQ = research question, E = Employee, C = Company.