This week, you will continue to work on the problem scope for your community needs assessment plan. In particular, you will consider your role in addressing the social problem of inclusion for individuals with developmental disabilities. For instance, what can you do, as a practitioner, to educate the public about the problem? How can you unite key stakeholders to address the problem? How will you encourage those stakeholders to engage in systems thinking in order to develop a sustainable solution?
As you continue to work on the problem scope, you will also consider ethical and moral issues that may arise when solving the problem and potential opportunities to increase cultural diversity, equity, inclusion, and empowerment. Be sure to draw upon this week’s Discussion as you address these areas. The discussion about power differentials should help you identify opportunities to increase cultural diversity, equity, inclusion, and empowerment when solving the problem.
Write 2 to 3 pages (not including the title or reference page) that address the following:
- Refine the Problem Statement. Refine your problem statement.
- Explain Your Role in Addressing the Problem. Explain your role, as an advanced human services professional practitioner, in addressing the problem.
- Identify Ethical and Moral Dilemmas. Describe one ethical dilemma and one moral dilemma that could arise when solving the problem. Explain how you would prevent each using the Ethical Standards for Human Services Professionals.
- Increase Cultural Diversity, Equity, Inclusion, and Empowerment. Explain one opportunity that exists to increase each of the following areas when addressing the social problem: equity, cultural diversity, inclusion, and empowerment. Pay close attention to who holds the power and how to attend to these issues through ethics and change management approaches.
References:
Stroh, D. P. (2015). Systems thinking for social change: A practical guide to solving complex problems, avoiding unintended consequences, and achieving lasting results. Chelsea Green Publishing.
- Chapter 3, “Telling Systems Stories” (pp. 29–44)
- Chapter 5, “An Overview of the Four-Stage Change Process” (pp. 71–78)
National Community Development Institute. (n.d.). Sources of power. http://www.buildingmovement.org/wp-content/uploads/2020/03/Sources_of_Power.pdf
National Organization for Human Services. (2015). Ethical standards for human services professionals. https://www.nationalhumanservices.org/ethical-standards-for-hs-professionals
Greene, J., & Haidt, J. (2002). How (and where) does moral judgment work? Trends in Cognitive Sciences, 6(12), 517–523. https://go.openathens.net/redirector/waldenu.edu?url=https://doi.org/10.1016/S1364-6613(02)02011-9
This tool is helpful for organizations working in collaborations. Users will understand the various ways that different participation and communication styles in a group setting are perceived. The tool also encourages users to consider ways to share power in order to establish trust.
NCDI, www.ncdinet.org
Sources of Power
1. Positional power comes from organizational authority or position (people providing capacity building technical support have this power). It is often forgotten by people with the power, rarely forgotten by those without it.
2. Referred power comes from connections to others (e.g., a staff member without formal positional power but who has known the ED for years).
3. Expert power comes from wisdom, knowledge, experience and/or skills (e.g., someone is widely respected because of her skills as an organizer).
4. Ideological power comes from an idea, vision or analysis. As Victor Hugo said, "Nothing can withstand the power of an idea whose time has come." It can be an individual's original idea, an ideal such as "democracy" or "liberation," or a developed ideology.
5. Obstructive power stems from the ability to coerce or block. Whether implicit, threatened or demonstrated, those without other sources of power may depend on it. Many activists are experts in its use.
6. Personal power includes energy, vision, ability to communicate, capacity to influence, emotional intelligence, psychological savvy, etc.
7. Co-powering is an idea that comes from the Latino community. It speaks to the responsibility of individual leaders to mindfully work towards supporting the personal power of others through modeling, validating and giving feedback.
8. Collaborative power comes from our ability to join our energies in partnership with others in pairs, teams, organizations, communities, coalitions and movements.
9. Institutional power means economic, legal and political power directly wielded by institutions, whether it's a corporation, police department or one of your organizations. The institution exists apart from the individuals who work there at any one time and enjoys name recognition, membership, etc.
10. Cultural power, from the perspective of the dominant culture, means cultural norms, conditioning and privilege regarding race/class/gender/age. (As with positional power, this power is often invisible to the dominant group. To those with less power, it is a real and everyday experience.) From the perspective of oppressed peoples, cultural power means a consciousness of community, class or culture that serves to empower.
11. Structural power is power that's covertly or implicitly exercised through the dominant institutions of society (e.g., the resistance to alternative medicine from the AMA and insurance providers, racism expressed and maintained through structures such as red-lining by lending institutions).
12. Transcendent power comes from our connection to a higher power such as spiritual, natural and/or historical imperative.
National Community Development Institute
This tool can be used for building a sense of community in a working group. To maximize effectiveness, consider using this tool as soon as you have a sense of the participants and the power dynamics that exist. To neutralize power differences, work with the group to establish group agreements, roles and strategies.
Minimizing Power Differences
Dealing with power differences in an open and honest way is a key ingredient of success in community building. This is not always easy, but it is essential to building relationships.
• Orient. Effective orientation of new members helps to equalize power relationships.
• Use ground rules or group agreements. Ground rules or group agreements work best if they are developed by the entire group, acknowledged, hung in a prominent place during meetings/activities, and periodically reviewed by the group to see if they are being honored or if there are additions or deletions.
• Increase numbers of those with less power. The best way to begin dealing with this is to have open discussions in your collaborative. If you are sincere about wanting to change the power dynamics of your group, you are the best experts on how to do this. For example, if you are looking to increase the number of parents of young children, you may have to change meetings to evening events with potlucks and childcare.
• Make special preparations. For example, schools and community-based programs that successfully involve parents in decision making make special efforts to inform and orient them. Approaches include special training, meeting in advance in small groups of all or mostly parents, or one-on-one conversations with parent representatives before and after meetings.
• Offer special support. Transportation and childcare reimbursements and stipends are commonly needed supports for parents. Remember, professionals who attend your collaborative meetings may or may not have an institution that is covering their time and expenses. Participants experience these differences in support as part of the power relationship.
• Listen to and respect all members. As leader or member, your role-modeling can help equalize power relationships in the collaboration.
• Be a relationship builder. Help facilitate and establish personal relationships between leadership in the collaborative and those with less positional power. People need to know their ideas will be heard and respected.
TRENDS in Cognitive Sciences Vol. 6 No. 12 December 2002
http://tics.trends.com 1364-6613/02/$ – see front matter © 2002 Elsevier Science Ltd. All rights reserved. PII: S1364-6613(02)02011-9

How (and where) does moral judgment work?

Joshua Greene and Jonathan Haidt

Joshua Greene, Dept of Psychology, Green Hall, Princeton University, Princeton, NJ 08544-1010, USA. e-mail: [email protected]
Jonathan Haidt, Dept of Psychology, University of Virginia, P.O. Box 400400, Charlottesville, VA 22904-4400, USA. e-mail: [email protected]

Moral psychology has long focused on reasoning, but recent evidence suggests that moral judgment is more a matter of emotion and affective intuition than deliberate reasoning. Here we discuss recent findings in psychology and cognitive neuroscience, including several studies that specifically investigate moral judgment. These findings indicate the importance of affect, although they allow that reasoning can play a restricted but significant role in moral judgment. They also point towards a preliminary account of the functional neuroanatomy of moral judgment, according to which many brain areas make important contributions to moral judgment although none is devoted specifically to it.

Why do we care so strongly about what other people do, even when their actions won’t affect us? And how do we decide that someone else has done something wrong? These questions are at the heart of moral psychology, and psychologists’ answers to these questions have changed with intellectual fashion. Historically, psychologists have disagreed about whether moral judgments are primarily products of emotional and non-rational processes (such as Freudian internalization or behaviorist reinforcement) or of reasoning and ‘higher’ cognition (as in Piaget’s and Kohlberg’s post-conventional reasoning). Recently, however, findings from several areas of cognitive neuroscience have begun to converge on an answer: emotions and reasoning both matter, but automatic emotional processes tend to dominate.

Trends in moral psychology

During the cognitive revolution of the 1950s and 1960s, behaviorist and Freudian theories gave way to mental models and information processing as the preferred framework in psychology. In the moral domain, Lawrence Kohlberg was a part of this revolution. He built on the earlier work of Jean Piaget [1] to develop a six-stage model of the development of moral reasoning [2]. According to Kohlberg, moral growth is driven not by simple brain maturation but rather by experience in ‘role taking’, or looking at a problem from multiple perspectives. Role taking improves moral reasoning, and moral reasoning, Kohlberg thought, drives moral judgment.

But as the cognitive revolution matured in the 1980s, many researchers began calling for a complementary ‘affective revolution’. Kohlberg’s focus on moral reasoning seemed to ignore the importance of moral emotions. At the same time, new findings in evolutionary psychology [3,4] and primatology [5] began to point to the origins of human morality in a set of emotions (linked to expanding cognitive abilities) that make individuals care about the welfare of others (e.g. kin altruism, including feelings of sympathy), and about cooperation, cheating, and norm-following (e.g. reciprocal altruism, including feelings of shame, gratitude and vengeance).

Integrating affect and reasoning

In the 1990s the affective revolution was reinforced by a new focus on ‘automaticity’ – the mind’s ability to solve many problems, including high-level social ones, unconsciously and automatically [6]. A recent comprehensive model, the social intuitionist model [7], brings together research on automaticity with findings in neuroscience and theory in evolutionary psychology. This model suggests that moral judgment is much like aesthetic judgment: we see an action or hear a story and we have an instant feeling of approval or disapproval. These feelings are best thought of as affect-laden intuitions, as they appear suddenly and effortlessly in consciousness, with an affective valence (good or bad), but without any feeling of having gone through steps of searching, weighing evidence, or inferring a conclusion. These intuitions – for example, about reciprocity, loyalty, purity, suffering – are shaped by natural selection, as well as by cultural forces. People certainly do engage in moral reasoning, but, as suggested by studies of informal reasoning [8], these processes are typically one-sided efforts in support of pre-ordained conclusions. (As William James said, ‘A great many people think they are thinking when they are merely rearranging their prejudices.’) Moral reasoning matters, but it matters primarily in social contexts in which people try to influence each other and reach consensus with friends and allies.

This emphasis on quick, automatic affective reactions is supported by recent findings in social psychology, such as: (1) that people evaluate others and apply morally laden stereotypes automatically [9]; (2) that motivations to maintain relationships and defend against threatening ideas bias judgments and motivate subsequent reasoning [10,11]; and (3) that people can very easily construct post-hoc reasons to justify their actions and judgments [12–14].

Somatic markers and decision-making

In keeping with this affective trend, Antonio Damasio and colleagues have generated widespread interest in the affective neural bases of social judgment through their ongoing study of patients with damage to the ventral and medial portions of the frontal lobes [15,16]. To varying degrees, these patients resemble Phineas Gage, the 19th century railroad foreman who made neurological history after an accidental explosion sent a tamping iron through his medial prefrontal cortex, robbing him of his sound judgment and remarkably little else [15,16]. Damasio and colleagues argue that contemporary patients with Gage-like damage (such as patient EVR [17]) have emotional deficits and, more specifically, an inability to generate and effectively use ‘somatic markers’, neural representations of body states that imbue behavioral options with affective significance and thus guide on-line decision-making. These patients’ deficits are revealed in their abnormal skin-conductance responses (an index of autonomic arousal) and poor performance on the Iowa Gambling Task, which simulates real-life decision-making [17]. These deficits exist to a surprising extent against a background of preserved ‘cognitive function’ as indexed by IQ tests and other measures. Moreover, such patients exhibit preserved abstract social knowledge in spite of their disastrous real-life judgment. Their affective deficits render them unable to feel their way through life, which suggests that normal decision-making is more emotional and less reasoned than many have believed [15].
Frontal damage and anti-social behavior
Frontal patients like EVR (see above) are more likely to hurt themselves than other people, but a recent study by Anderson et al. of two patients with prefrontal damage acquired during early childhood reports behavior that is more flagrantly immoral [18,19]. These patients lie, steal, have neglected their children, and at times have been physically aggressive – all without apparent remorse. Both patients perform reasonably well on IQ tests and other standard cognitive measures and perform poorly on the Iowa Gambling Task, but unlike adult-onset patients their knowledge of social/moral norms is deficient. Their moral reasoning appears to be, in the terminology of Kohlberg, ‘preconventional’, conducted from an egocentric perspective in which the purpose is to avoid punishment. Other tests show that they have a limited understanding of the social and emotional implications of decisions and fail to identify primary issues and generate appropriate responses to hypothetical social situations. Thus, it appears that the brain regions compromised in these patients (ventral, medial, and polar aspects of the prefrontal cortex) include structures that are crucial not only for on-line decision-making but also for the acquisition of social knowledge and dispositions towards normal social behavior.
Other studies have documented impaired social behavior resulting from frontal dysfunction [20]. In a study of 279 Vietnam War veterans, Grafman and colleagues found that patients with ventromedial frontal damage tend towards violence and aggression [21]. Raine and colleagues have found that individuals diagnosed with anti-social personality disorder have reduced prefrontal gray matter and exhibit reduced autonomic responses to the performance of a socially stressful task [22], and a recent popular account documenting a large number of case studies attributes violent criminal behavior to a combination of childhood abuse and frontal damage [23].
Psychopaths exhibit extreme anti-social behavior in the absence of observed brain damage. However, a recent neuroimaging study has shown that psychopaths exhibit less emotion-related neural activity than control subjects while responding to emotionally valenced words [24]. Blair and others have conducted several studies that characterize psychopathy as an affective disorder involving a reduction in empathy [25] and consequent deficits in moral judgment both in and out of the laboratory [26]. Blair observes that psychopaths, unlike frontal patients, are marked by their tendency towards instrumental rather than reactive aggression [27].
Neuroimaging
Responses to moral sentences and pictures

A handful of recent studies have used functional neuroimaging to study moral psychology. In an fMRI study, Jorge Moll and colleagues [28] presented subjects with simple claims, some with moral content (‘They hung an innocent’) and others without moral content (‘Stones are made of water’). Judgments in response to claims with moral content produced increased activity bilaterally in the frontal pole, in the medial frontal gyrus, right cerebellum, right temporal pole, superior temporal sulcus (STS), left orbitofrontal cortex (OFC), left precuneus, and the posterior globus pallidus. A more recent study by Moll and colleagues [29] compared judgments in response to simple moral claims with judgments in response to unpleasantly valenced non-moral claims with social content, many of which evoke disgust (e.g. ‘He licked the dirty toilet’, ‘Pregnant women often throw up’). A direct comparison of these two conditions revealed greater activity in the left medial OFC for the moral condition and greater activation in the left lateral OFC as well as the left amygdala for the non-moral/social condition. These results suggest a functional dissociation between neural networks within the OFC and associated structures that specialize in processing different kinds of social/emotional information relevant (in varying degrees) to moral judgment. A third study by Moll and colleagues [30] found similar neural responses to pictures with moral content (e.g. physical assaults, poor abandoned children). The medial frontal and posterior cingulate regions were also activated in an fMRI study of empathy and forgiveness [31]. (See also Table 1.)
Emotional engagement in ‘personal’ versus ‘impersonal’ moral judgments

Whereas Moll and colleagues have investigated moral cognition by distinguishing the effects of moral versus non-moral phenomena, Greene and colleagues [32] have drawn a distinction within the moral domain between ‘personal’ and ‘impersonal’ moral judgments (see Box 1). Greene and colleagues scanned subjects
using fMRI while they responded to a series of personal and impersonal moral dilemmas as well as non-moral dilemmas, all of which involved complex narratives. They found that responding to personal moral dilemmas, as compared with impersonal and non-moral dilemmas, produced increased activity in areas associated with social/emotional processing: medial frontal gyrus, posterior cingulate gyrus, and bilateral STS (originally labeled ‘angular gyrus’). By contrast, impersonal and non-moral dilemmas as compared with personal dilemmas produced increased activity in areas associated with working memory: dorsolateral prefrontal and parietal areas (see Fig. 1). They found comparatively little difference between the impersonal-moral and non-moral conditions, suggesting that impersonal moral judgment has less in common with personal moral judgment than with certain kinds of non-moral practical judgment.
Greene et al. carried out an analysis of subjects’ reaction times to link these imaging data to behavior. Subjects were slow to approve of personal violations but relatively quick to condemn them. By contrast, approvals and disapprovals took equally long for impersonal moral and non-moral judgments. This pattern is explained by subjects’ having to overcome their negative emotional responses when approving of personal moral violations as compared with other, less emotionally charged actions (this might be likened to the pattern of interference observed in the Stroop task).
The neuroanatomy of moral judgment
The functional neuroimaging boom has provided a wealth of information about the neuroanatomy of emotion, social cognition, and other neural processes. These data, combined with the lesion and pathology data above, allow us to interpret the results of the imaging studies described in the previous section and thus broaden and refine our understanding of the ‘moral brain’.
Suppose a runaway trolley is about to run over and kill five people. Suppose further that you can hit a switch that will divert the trolley onto a different set of tracks where it will kill only one person instead of five. Is it okay to hit the switch? Now, what if the only way to save the five people were to push a large person (larger than yourself) in front of the trolley, killing him but saving the others? Would that be okay?
Most people say ‘yes’ to the first case and ‘no’ to the second in spite of the fact that these cases are so similar [a]. Although it is easy to generate (and surprisingly difficult to defend) hypotheses about why one ought to treat these cases differently [b], Greene et al. have attempted to explain how people do in fact arrive at this puzzling pair of conclusions [a]. To explain the difference, they posit a distinction between what they believe are two fundamentally different kinds of moral thinking, drawing on capacities that emerged at different stages of human evolution. On the one hand, moral thinking is driven largely by social-emotional dispositions built on those we inherited from our primate ancestors [c,d]. At the same time, humans have a unique capacity for sophisticated abstract reasoning that can be applied to any subject matter. One might suppose, then, that human moral thinking is not one kind of process, but rather a complex interplay between (at least) two distinct types of processes: domain-specific, social-emotional responses and domain-neutral reasoning processes applied in moral contexts.
With this in mind, Greene and colleagues distinguished between ‘personal’ and ‘impersonal’ moral violations and judgments. A moral violation is personal if it is: (i) likely to cause serious bodily harm, (ii) to a particular person, (iii) in such a way that the harm does not result from the deflection of an existing threat onto a different party. A moral violation is impersonal if it fails to meet these criteria. One can think of these criteria for personal harm in terms of ‘ME HURT YOU’, and as delineating roughly those violations that a chimpanzee can appreciate. The ‘HURT’ condition picks out roughly the kinds of harm that a chimp can understand (e.g. assault, as opposed to, say, tax evasion). The ‘YOU’ condition requires that the victim be vividly represented as an individual. Finally, the ‘ME’ condition captures the notion of ‘agency’ [b], the idea that the action must spring in a vivid way from an agent’s will, must be ‘authored’ rather than merely ‘edited’ by an agent.
Pushing someone in front of a trolley meets all three criteria and is therefore personal, whereas diverting a trolley involves merely deflecting an existing threat, removing the crucial sense of agency and therefore making this violation impersonal.
References
a Greene, J.D. et al. (2001) An fMRI investigation of emotional engagement in moral judgment. Science 293, 2105–2108
b Thomson, J.J. (1986) Rights, Restitution, and Risk: Essays in Moral Theory, Harvard University Press
c Flack, J.C. and de Waal, F.B.M. (2000) ‘Any animal whatever’: Darwinian building blocks of morality in monkeys and apes. In Evolutionary Origins of Morality (Katz, L.D., ed.), pp. 1–29, Imprint Academic
d Haidt, J. (2001) The emotional dog and its rational tail: a social intuitionist approach to moral judgment. Psychol. Rev. 108, 814–834
Box 1. Two kinds of moral thinking: personal and impersonal
Fig. 1. Brain areas (indicated by Brodmann’s area (BA)) exhibiting differences in activity in response to personal moral dilemmas as compared with impersonal and non-moral dilemmas [32]. Areas exhibiting greater activity for personal moral dilemmas (as compared with impersonal and non-moral): medial frontal gyrus (BA 9/10); posterior cingulate gyrus (BA 31); superior temporal sulcus,
inferior parietal lobe (BA 39). Areas exhibiting greater activity for impersonal moral dilemmas (as compared with personal): dorsolateral prefrontal cortex (BA 46); parietal lobe (BA 7/40). Images are reversed left to right according to radiologic convention. (Reprinted with permission from Greene et al. [32]. Copyright 2001 American Association for the Advancement of Science.)
The ventral and medial prefrontal cortices

Table 1 and Fig. 2 summarize results relevant to the neuroanatomy of moral judgment. One brain area of great interest is the medial frontal gyrus, around the border of Brodmann areas (BA) 9 and 10, which probably serves in the integration of emotion into decision-making and planning [33,34] and might also play a role in theory of mind (ToM, the capacity to represent others’ mental states) and other specifically social functions relevant to moral judgment [35,36]. This
Table 1. The moral brain (see also Fig. 2, color-coded to this figure). The first column lists eight brain areas (Brodmann’s areas in parentheses) implicated in moral cognition by neuroimaging studies. Subsequent columns provide additional information about their functions: associated moral tasks, other associated tasks, social pathology from damage, and likely functions. The first four entries:

1. Medial frontal gyrus (BA 9/10). Associated moral tasks: personal and impersonal moral judgments (relative to non-moral) [32], simple moral judgments [28], viewing moral pictures [30], forgivability judgments [31]. Social pathology from damage: poor practical judgment [15,16], reactive aggression [27], and (primarily in developmental cases) diminished empathy and social knowledge [18]. Likely functions: integration of emotion into decision-making and planning [15,16], especially for conscious processes [33].

2. Posterior cingulate, precuneus, retrosplenial cortex (BA 31/7). Associated moral tasks: personal and impersonal moral judgments (relative to non-moral) [32], simple moral judgments [28], forgivability judgments [31], moral pictures [30]. Social pathology from damage: impaired recognition memory for faces; possibly Capgras delusion [55]. Likely functions: integration of emotion, imagery (especially precuneus [39]), and memory [38], especially for coherent social narratives.

3. Superior temporal sulcus, inferior parietal lobe (BA 39). Associated moral tasks: personal moral judgments [32], simple moral judgments [28,29], moral pictures [30]. Social pathology from damage: impaired judgment from eye gaze (monkeys) [40]; possibly Capgras delusion [41]. Likely functions: supporting representations of socially significant movements [40], and possibly complex representations of ‘personhood’ [41].

4. Orbitofrontal/ventromedial frontal cortex (BA 10/11). Associated moral tasks: simple moral judgments [28,29], moral pictures [30]. Social pathology from damage: poor practical judgment [15,16], reactive aggression [27], and (primarily in developmental cases) diminished empathy and social knowledge [18]. Likely functions: representation of reward/punishment value [15,16,37]; control of inappropriate/disadvantageous behavior [15,27].