Case Study from the year 2023 in the subject Computer Science - Internet, New Technologies, grade: 1,0, Charles University in Prague (Faculty of Social Sciences), course: Security and Technology, language: English, abstract: This paper tries to answer the question of how the LA Police Department uses AI (Artificial Intelligence) algorithms to profile criminality. Can these algorithms always be fair and impartial? In today's society, we are constantly surrounded by artificial intelligence, especially by algorithms that try to profile us in order to predict our future behaviour. These algorithms are deeply embedded in our daily lives: whether we are watching Netflix, using Amazon Music, or shopping online, they present us with suggestions designed to get us more involved with the platform, so that we spend more time there and therefore more money. But these algorithms are not operated by commercial companies alone; they are also deployed by public-policy providers and law enforcement agencies, such as police forces. First, the author gives a detailed explanation of how these algorithms work; he then lists areas in which police forces might use them, such as facial recognition software and sentencing recommendations for criminals. However, the paper also discusses criticism of the use of this technology by law enforcement. Many ascribe a certain bias to these algorithms, especially a racial one, since they have mainly been trained on datasets that are themselves biased. The next chapter contains the author's methodology, which he then applies to his case study of the LA Police Department in the last part of the paper.
The information provided in the "Synopsis" section may refer to another edition of this title.
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand; delivery takes 3-4 days longer. New stock. 20 pp. English. Seller reference no. 9783346848796
Quantity available: 2
Seller: buchversandmimpf2000, Emtmannsberg, Bavaria, Germany
Paperback. Condition: New. Print-on-demand title. New stock. Books on Demand GmbH, Überseering 33, 22297 Hamburg. 20 pp. English. Seller reference no. 9783346848796
Quantity available: 1
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Print on demand; printed after ordering. New stock. Seller reference no. 9783346848796
Quantity available: 1
Seller: preigu, Osnabrück, Germany
Paperback. Condition: New. Biased Algorithms in Law Enforcement Agencies. A Case Study of the LA Police Department | Julian Schoenemeyer | Paperback | English | 2023 | GRIN Verlag | EAN 9783346848796 | Responsible person for the EU: preigu GmbH & Co. KG, Lengericher Landstr. 19, 49078 Osnabrück, mail[at]preigu[dot]de | Supplier: preigu Print on Demand. Seller reference no. 126778983
Quantity available: 5