Federal Family Minister Karin Prien (CDU) warns that artificial intelligence can reproduce discrimination. She told newspapers of the Funke Media Group that AI is trained on internet data and therefore inherits the prejudices that data contains. “Responding to this is not just about regulation but also about greater transparency and clear quality standards,” she said.
Prien cited the automated pre‑selection of job applications as an example. “If such systems are trained on a company’s historical data, women can be disadvantaged,” she cautioned. “When men have historically dominated certain roles, the AI learns those patterns.”
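The mechanism Prien describes can be made concrete with a minimal sketch. The data and numbers below are entirely hypothetical and stand in for no real screening system; the point is only that a model fitted to biased historical records reproduces the disparity in those records.

```python
# Illustrative sketch with invented data: a naive "model" trained on
# historical hiring records simply learns the past hiring rates,
# including any gender disparity baked into them.
from collections import defaultdict

# Hypothetical history: (gender, hired) for equally qualified applicants.
# Past practice: men hired far more often than equally qualified women.
history = (
    [("m", True)] * 90 + [("m", False)] * 10 +
    [("f", True)] * 40 + [("f", False)] * 60
)

# "Training": estimate P(hired | gender) from the historical records.
counts = defaultdict(lambda: [0, 0])  # gender -> [hired, total]
for gender, hired in history:
    counts[gender][1] += 1
    counts[gender][0] += int(hired)

rates = {g: hired / total for g, (hired, total) in counts.items()}
print(rates)  # -> {'m': 0.9, 'f': 0.4}: the model reproduces the disparity
```

A system ranking new applicants by these learned rates would systematically score women lower, not because of their qualifications but because of the historical pattern it was trained on, which is exactly the risk the minister points to.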
She called for explicit standards regarding the transparency of training data, mechanisms to test for discrimination risks, and stronger diversity within development teams. “We must ask whether the design, training and deployment of these systems adequately consider women’s perspectives,” the women’s minister added. Today, women are markedly underrepresented in AI development and technical leadership positions, which shapes which questions are asked, which data are chosen and how systems are evaluated. Whether this requires traditional regulation, voluntary commitments or certification schemes, she said, must be discussed in detail.