Gathering and Evaluating Disciplinary Insights
Introduction
For the next three weeks you will be gathering disciplinary insights and beginning to critically evaluate them. This assignment will serve as a ‘checkpoint’ for you. You will complete this same assignment next week, but with different sources.
Process
You should already have a list of sources from your Proposal. Start by getting those sources and reading them.
After reading each article, you will summarize it (put it in your own words). Remember that you should do this in such a way that it will actually be of help to you later when you write your paper. You will also include full bibliographic information. Use the Purdue OWL if you are not sure how. You may use any standard citation style (e.g., APA), but you should use it consistently.
You will find plenty of citations in any scholarly source you are reading, and you might decide to read one of those next!
Product
You should submit a list of the sources you have read so far. For each source that was helpful (one that you might actually use in your paper), you should include a summary of the article, the discipline it comes from, the key insight/take-away, and a short critical evaluation. You might also include ‘future reading’. You need this level of detail for AT LEAST three sources.
Example
List of Sources Read so Far:
Greene, J. (2003). From neural ‘is’ to moral ‘ought’: what are the moral implications of neuroscientific moral psychology? Nature Reviews Neuroscience, 4, 846-849.
Greene, J. (2009). The Cognitive Neuroscience of Moral Judgment. In The Cognitive Neurosciences IV, ed. Gazzaniga, M. S., Cambridge, MA: MIT Press.
Greene, J. (2014). Beyond Point-and-Shoot Morality: Why Cognitive (Neuro)Science Matters for Ethics. Ethics, 124(4), 695-726.
FOR EACH SOURCE THAT WAS USEFUL, SUBMIT THE FOLLOWING:
Scholarly Source: Greene, J. D., Morelli, S. A., Lowenberg, K., Nystrom, L. E., & Cohen, J. D. (2008). Cognitive Load Selectively Interferes With Utilitarian Moral Judgment. Cognition, 107, 1144-1154.
Discipline: Psychology
Summary/Key Insight: Greene claims that moral judgments come from two systems: System 1 is automatic, emotional, and tends toward deontological judgments; System 2 is effortful, reason-based, and tends toward consequentialist judgments. This article reviews Greene’s evidence for this, which comes from ‘trolley cases.’ In the switch version of the dilemma, a trolley is heading toward five people standing on the track, and participants decide whether it is morally permissible to flip a switch that redirects the trolley onto a different track with only one person on it. In the footbridge version, participants decide whether it is morally permissible to push a large man onto the tracks to stop a trolley from running over five people. While Greene and others think these dilemmas are essentially asking the same question (“Would you sacrifice one person to save five?”), in switch the majority of participants find it permissible to flip the switch, while in footbridge the majority find it impermissible to push the large man. That is, in switch the majority of participants make the characteristically consequentialist judgment, while in footbridge the majority make the characteristically deontological judgment. Greene’s explanation of this finding is that deontological judgments are generated by System 1 processes, which are triggered by the up-close-and-personal nature of the harm in footbridge (i.e., physically pushing the man). In switch, on the other hand, the abstract nature of flipping a switch fails to trigger our System 1 responses and, in effect, our judgment is a product of System 2 processing, which uses deliberative reasoning to add up the difference in lives lost.
The finding from Greene et al. (2008) that I plan to use in my work is that loading working memory had no effect on utilitarian judgments: “there was no effect of load on judgment…with 61% utilitarian judgments under load (95% CI: 57%-66%) and 60% (95% CI: 55%-64%) in the absence of load” (Greene et al. 2008: 1047-1048). This is an interesting finding because it seems to contradict Greene’s prediction; he actually (nearly) says as much in the article.
Critical Evaluation: First, a critical evaluation of the method. Greene’s method relies on testing people in lab settings rather than looking at how people actually respond in the real world. This is typical of psychological methods. He may be capturing one thin aspect of the causal story, but there is so much more to that story (especially the social dimension) that I am concerned the results may not generalize to actual cases. Second, a critical evaluation of the concepts. System 1 is defined as emotional (as opposed to reasoning) and deontological, but looking at how philosophers use the term ‘deontological’, it seems to be about acting in accordance with one’s reasons. Third, the sharp contrast between emotion and reason seems problematic and promotes the kind of dualistic thinking that can inhibit integration. That said, the findings of the experiments are helpful; I think the issues arise when Greene tries to draw out the implications of those experiments.
Possible Further Reading:
Greene, J. (2003). From neural ‘is’ to moral ‘ought’: what are the moral implications of neuroscientific moral psychology? Nature Reviews Neuroscience, 4, 846-849.
Greene, J. (2009). The Cognitive Neuroscience of Moral Judgment. In The Cognitive Neurosciences IV, ed. Gazzaniga, M. S., Cambridge, MA: MIT Press.
Greene, J. (2014). Beyond Point-and-Shoot Morality: Why Cognitive (Neuro)Science Matters for Ethics. Ethics, 124(4), 695-726.
Greene, J., Sommerville, R., Nystrom, L., Darley, J., & Cohen, J. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, 293, 2105-2108.
Greene, J., Nystrom, L., Engell, A., Darley, J., & Cohen, J. (2004). The neural bases of cognitive conflict and control in moral judgment. Neuron, 44, 389-400.
Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108, 814-834.
Haidt, J., & Hersh, M. (2001). Sexual morality: The cultures and emotions of conservatives and liberals. Journal of Applied Social Psychology, 31, 191-221.
FOR THIS ASSIGNMENT, PLEASE USE THE FOLLOWING THREE SOURCES:
Álvarez, J. M. S. (2023). Operation Barbarossa, the dissolution of the USSR and the Russo-Ukrainian War: 1917-2023. Advances in Social Sciences Research Journal, 10(10), 85–103. https://doi.org/10.14738/assrj.1010.15655
Blitstein, A. (2007). What Stalin knew: The enigma of Barbarossa (review). The Journal of Military History, 71(2), 561–562. https://doi.org/10.1353/jmh.2007.0094
Harward, G. T. (2021). “To the End of the Line”: The Romanian Army in Operation Barbarossa. The Journal of Slavic Military Studies, 34(4), 599–618. https://doi.org/10.1080/13518046.2022.2040834