<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns="http://purl.org/rss/1.0/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/">
<channel rdf:about="https://hdl.handle.net/20.500.11811/61">
<title>Evangelisch-Theologische Fakultät</title>
<link>https://hdl.handle.net/20.500.11811/61</link>
<description/>
<items>
<rdf:Seq>
<rdf:li rdf:resource="https://hdl.handle.net/20.500.11811/13635"/>
<rdf:li rdf:resource="https://hdl.handle.net/20.500.11811/13615"/>
<rdf:li rdf:resource="https://hdl.handle.net/20.500.11811/13357"/>
<rdf:li rdf:resource="https://hdl.handle.net/20.500.11811/13350"/>
</rdf:Seq>
</items>
<dc:date>2026-04-05T22:32:37Z</dc:date>
</channel>
<item rdf:about="https://hdl.handle.net/20.500.11811/13635">
<title>Philipp Melanchthon, Epistolae Pauli ad Romanos, Enarratio</title>
<link>https://hdl.handle.net/20.500.11811/13635</link>
<description>Philipp Melanchthon, Epistolae Pauli ad Romanos, Enarratio
Melanchthon, Philipp
Wolter, Michael
Critical edition and German translation of Philipp Melanchthon's "Epistolae ad Romanos, Enarratio"
</description>
<dc:date>2025-01-01T00:00:00Z</dc:date>
</item>
<item rdf:about="https://hdl.handle.net/20.500.11811/13615">
<title>Consciousness and Human Brain Organoids</title>
<link>https://hdl.handle.net/20.500.11811/13615</link>
<description>Consciousness and Human Brain Organoids
Van Gyseghem, Aileen; Dierickx, Kris; Barnhart, Andrew J.
Human brain organoids (HBOs) are three-dimensional structures derived from human pluripotent stem cells that model aspects of fetal brain development. As HBO models grow more complex, ethical concerns arise, particularly around the potential for consciousness. Defining and detecting consciousness in HBOs remains unresolved, with existing theories offering conflicting predictions. This systematic review examines how consciousness is conceptualized in the ethical and philosophical literature concerning HBOs. We selected peer-reviewed publications written in English from 2013 onward that directly address consciousness in relation to HBOs. After screening 51 sources, 24 were analysed under six themes: Consciousness Terminology, Biological Limitations, Theories of Consciousness, Detecting Consciousness, Comparisons with Conscious Entities, and Special Entities. Uncertainty about consciousness in general complicates the conversation around HBOs. Clear communication is essential to avoid misconceptions, and future research may benefit from focusing on organoid intelligence as a more tractable concept.
</description>
<dc:date>2025-07-09T00:00:00Z</dc:date>
</item>
<item rdf:about="https://hdl.handle.net/20.500.11811/13357">
<title>Bytes the Dust: Normative Notions in Decommissioning Digital Doppelgängers</title>
<link>https://hdl.handle.net/20.500.11811/13357</link>
<description>Bytes the Dust: Normative Notions in Decommissioning Digital Doppelgängers
Barnhart, Andrew J.; Comerci, Giuseppe; Braun, Matthias
In recent debates on digital twins, much attention has been paid to understanding the interaction between individuals and their digital representations (Braun, 2021). Iglesias et al. (2025) shed new light on this debate, extending the reflection on digital doppelgängers—digital twins that try to replicate the psychological dimension of an individual. They argue that such copies may serve as valuable means to achieve legacy and relational aims left unaddressed due to the person's death. Against this background, we discuss how far we may better understand the implied normative aspects by considering them in terms of the represented person's death. Specifically, we ask how we can and should, in normative terms, deal with a digital twin as a representation of a person after their death.&lt;br /&gt; Here, we consider the decommissioning of such technology. We define decommissioning as the withdrawal or dismantling of the doppelgänger, or the rendering of it incapable of serving its original aims. We hypothesize that the way in which these digital doppelgängers ought to be decommissioned may depend upon whether they are viewed either as a proxy or as an extension of personhood. By proxy, we mean a stand-in for an individual that replicates their decisions and style without embodying their personal identity or subjective experience; something that makes decisions on your behalf but is not you (Braun and Krutzinna 2022). What is left behind is akin to an artifact owned by you. An extension of personhood, by contrast, can mean extending aspects of an individual's identity and relational presence beyond death by reflecting their values, projects, and relationships; something that is/was a part of yourself. 
What is left behind is akin to an "informational corpse" (Öhman and Floridi 2018).&lt;br /&gt; Answering this decommissioning question is necessary not only to respect the intended aims of those for whom the digital doppelgängers were created, but also to potentially respect certain social norms surrounding obsequies. Viewing digital doppelgängers either as proxies or extensions of personhood implies respective normative notions. For instance, the pursuit of any decommissioning strategy will require necessary and sufficient standards of informed consent, which may be difficult to parse given that not all individuals will view their digital doppelgänger in the same manner. The decommissioning of digital doppelgängers is thus enriched by moral nuances influenced by the perceptions we may have of this technology.
</description>
<dc:date>2025-01-29T00:00:00Z</dc:date>
</item>
<item rdf:about="https://hdl.handle.net/20.500.11811/13350">
<title>Tackling Structural Injustices</title>
<link>https://hdl.handle.net/20.500.11811/13350</link>
<description>Tackling Structural Injustices
Braun, Matthias; Bleher, Hannah; Hille, Eva Maria; Krutzinna, Jenny
In today's world, Artificial Intelligence plays a central role in many decision-making processes. However, its use can lead to structural and epistemic injustices, especially in the context of health. In 2019, for example, an algorithm used millions of times in American hospitals favored White patients over Black patients. The algorithm was used to predict the likelihood that patients would need additional medical care. Skin color itself was not considered a variable. What was taken into account was rather the development of costs in the health sector. Skin color, however, correlated negatively with the level of health care costs in the underlying data sets: for a variety of reasons, Black patients had, on average, lower health care costs than White patients with the same medical conditions (Vartan 2019). In another case, it was observed that newborns with a positive screening result for rare diseases were diagnosed and treated later if they were patients of color (Zavala et al. 2021). What becomes evident in both cases with respect to different technologies is that there is a link between the use of new technologies and experiences of injustice for (different) marginalized groups that has not been sufficiently considered so far (Wachter 2022).&lt;br /&gt; Experiences of marginalization and invisibility based on specific characteristics such as skin color, gender, sexuality, ethnicity, socio-economic background, and others pose major challenges to questions of justice in dealing with new technologies such as the novel genetic tests or algorithmic decisions in the examples above. Depending on the characteristic and the value attached to it, people have different experiences. Experience is not just an abstract category here. It also refers to specific claims to be visible in public space and how difficult it can sometimes be to assert rights to good treatment (Braun and Krutzinna 2022). 
In this short paper, we argue that it is central to focus on negotiations of social recognition from an ethics of life forms perspective in order to combat the experiences of injustice caused by new forms of technology.
</description>
<dc:date>2023-06-20T00:00:00Z</dc:date>
</item>
</rdf:RDF>
