- Put your name on your assignment.
- Double-space your work.
- Include page numbers.
- Use the citation format demonstrated in the first class and posted on Canvas.
- Word-count guidelines describe average work.

The assignment is based on your reading, required external research, and analysis of "How (Un)ethical Are You?" by Mahzarin R. Banaji, Max H. Bazerman, and Dolly Chugh.
In the article "How (Un)ethical Are You?", the authors discuss bias that emerges from unconscious beliefs. Based on the reading, your external research, your analysis, and the following scenario, answer the questions below:
- Scenario:
- YOU are the manager of a small, technical project team.
- YOU have been given the responsibility of hiring a new member of the team with a specific, technical skill set.
- How would you make an ethical, unbiased hiring decision?
- The Assignment:
- 80 Points: Based on the course reading (Banaji et al.) and materials, your own external scholarly research (minimum of 2 sources required), and your own thoughts and analysis, identify and describe 4 strategies that you would use to ensure you make an ethical, unbiased hiring decision. For each strategy (worth 20 points each), write a paragraph (200 words is average work) explaining specifically what you would do and why; be specific about the actions you would take. Only 2 of your answers may be sourced from the Banaji et al. reading. Format your answers as follows:
- Hiring Strategy 1:
- Hiring Strategy 2:
- Hiring Strategy 3:
- Hiring Strategy 4:
- 20 Points: Write a paragraph (200 words is average work) about your own personal experience with being hired. Was it ethical or not? Explain your answer in detail using concrete examples. Ground your analysis of your own experience in the readings and your research. If you haven't yet gone through a hiring process for any type of job, write about what kind of process you would hope to experience and why.
- Guidance:
- Do not write in generalities or theories on ethics, bias, team management, etc.
- Write specifically about what YOU would do to ensure YOU act ethically and without bias in making the hiring decision.
- For best results, do the following:
- Read "How (Un)ethical Are You?"
- Research ethics, unconscious bias/beliefs, and management, and review any applicable lectures and your notes.
- You should have a minimum of two outside sources (articles). You must provide attribution for your sources (e.g., Harvard Business Review, the NY Times, in-class discussion, any of the class readings, Rutgers LinkedIn Learning).
Good managers often make unethical decisions - and don't even know it.
How (Un)ethical Are You?
by Mahzarin R. Banaji, Max H. Bazerman, and Dolly Chugh
Answer true or false: "I am an ethical manager."
If you answered "true," here's an uncomfortable fact: You're probably not. Most of us believe that we are ethical and unbiased. We imagine we're good decision makers, able to objectively size up a job candidate or a venture deal and reach a fair and rational conclusion that's in our, and our organization's, best interests. But more than two decades of research confirms that, in reality, most of us fall woefully short of our inflated self-perception. We're deluded by what Yale psychologist David Armor calls the illusion of objectivity, the notion that we're free of the very biases we're so quick to recognize in others. What's more, these unconscious, or implicit, biases can be contrary to our consciously held, explicit beliefs. We may believe with confidence and conviction that a job candidate's race has no bearing on our hiring decisions or that we're immune to conflicts of interest. But psychological research routinely exposes counterintentional, unconscious biases. The prevalence of these biases suggests that even the most well-meaning person unwittingly allows unconscious thoughts and feelings to influence seemingly objective decisions. These flawed judgments are ethically problematic and undermine managers' fundamental work: to recruit and retain superior talent, boost the performance of individuals and teams, and collaborate effectively with partners.
This article explores four related sources of unintentional unethical decision making: implicit forms of prejudice, bias that favors one's own group, conflict of interest, and a tendency to overclaim credit. Because we are not consciously aware of these sources of bias, they often cannot be addressed by penalizing people for their bad decisions. Nor are they likely to be corrected through conventional ethics training. Rather, managers must bring a new type of vigilance to bear. To begin, this requires letting go of the notion that our conscious attitudes always represent what we think they do. It also demands that we abandon our faith in our own objectivity and our ability to be fair. In the following pages, we will offer strategies that can help managers recognize these pervasive, corrosive, unconscious biases and reduce their impact.

Mahzarin R. Banaji is the Richard Clarke Cabot Professor of Social Ethics in the department of psychology at Harvard University and the Carol K. Pforzheimer Professor at Harvard's Radcliffe Institute for Advanced Study in Cambridge, Massachusetts. Max H. Bazerman is the Jesse Isidor Straus Professor of Business Administration at Harvard Business School in Boston. Dolly Chugh, a Harvard Business School MBA, is now a doctoral candidate in Harvard University's joint program in organizational behavior and social psychology.
Implicit Prejudice: Bias That Emerges from Unconscious Beliefs

Most fair-minded people strive to judge others according to their merits, but our research shows how often people instead judge according to unconscious stereotypes and attitudes, or "implicit prejudice." What makes implicit prejudice so common and persistent is that it is rooted in the fundamental mechanics of thought. Early on, we learn to associate things that commonly go together and expect them to inevitably coexist: thunder and rain, for instance, or gray hair and old age. This skill, to perceive and learn from associations, often serves us well.
But, of course, our associations only reflect approximations of the truth; they are rarely applicable to every encounter. Rain doesn't always accompany thunder, and the young can also go gray. Nonetheless, because we automatically make such associations to help us organize our world, we grow to trust them, and they can blind us to those instances in which the associations are not accurate, when they don't align with our expectations.
Because implicit prejudice arises from the ordinary and unconscious tendency to make associations, it is distinct from conscious forms of prejudice, such as overt racism or sexism. This distinction explains why people who are free from conscious prejudice may still harbor biases and act accordingly. Exposed to images that juxtapose black men and violence, portray women as sex objects, imply that the physically disabled are mentally weak and the poor are lazy, even the most consciously unbiased person is bound to make biased associations. These associations play out in the workplace just as they do anywhere else.
In the mid-1990s, Tony Greenwald, a professor of psychology at the University of Washington, developed an experimental tool called the Implicit Association Test (IAT) to study unconscious bias. A computerized version of the test requires subjects to rapidly classify words and images as "good" or "bad." Using a keyboard, test takers must make split-second "good/bad" distinctions between words like "love," "joy," "pain," and "sorrow" and at the same time sort images of faces that are (depending on the bias in question) black or white, young or old, fat or thin, and so on. The test exposes implicit biases by detecting subtle shifts in reaction time that can occur when test takers are required to pair different sets of words and faces. Subjects who consciously believe that they have no negative feelings toward, say, black Americans or the elderly are nevertheless likely to be slower to associate elderly or black faces with the "good" words than they are to associate youthful or white faces with "good" words.
Since 1998, when Greenwald, Brian Nosek, and Mahzarin Banaji put the IAT online, people from around the world have taken over 2.5 million tests, confirming and extending the findings of more traditional laboratory experiments. Both show implicit biases to be strong and pervasive. (For more information on the IAT, see the sidebar "Are You Biased?")
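To make the mechanism concrete, here is a minimal sketch, in Python with invented reaction times, of how an IAT-style bias index could be computed as the gap in average response speed between the two pairing conditions, scaled by overall variability. This is only an illustration of the logic described above, not the scoring algorithm Project Implicit actually uses (its D-score involves additional steps), and every number in it is hypothetical.

```python
# Illustrative only: a simplified IAT-style index. "Congruent" trials pair
# concepts the test taker implicitly associates; "incongruent" trials pair
# the opposite. Slower incongruent responses suggest an implicit association.
from statistics import mean, stdev

congruent_ms = [612, 590, 634, 601, 580, 622]    # hypothetical reaction times (ms)
incongruent_ms = [705, 688, 730, 699, 712, 694]  # hypothetical reaction times (ms)

def iat_style_index(congruent, incongruent):
    """Difference in mean reaction time, scaled by the pooled standard deviation."""
    diff = mean(incongruent) - mean(congruent)
    return diff / stdev(congruent + incongruent)

print(f"Mean slowdown: {mean(incongruent_ms) - mean(congruent_ms):.0f} ms")
print(f"IAT-style index: {iat_style_index(congruent_ms, incongruent_ms):.2f}")
```

The point of the sketch is simply that the bias is inferred from timing rather than from what the test taker says about their attitudes, which is why the results so often surprise people.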
Biases are also likely to be costly. In controlled experiments, psychologists Laurie Rudman at Rutgers and Peter Glick at Lawrence University have studied how implicit biases may work to exclude qualified people from certain roles. One set of experiments examined the relationship between participants' implicit gender stereotypes and their hiring decisions. Those holding stronger implicit biases were less likely to select a qualified woman who exhibited stereotypically "masculine" personality qualities, such as ambition or independence, for a job requiring stereotypically "feminine" qualities, such as interpersonal skills. Yet they would select a qualified man exhibiting these same qualities. The hirers' biased perception was that the woman was less likely to be socially skilled than the man, though their qualifications were in fact the same. These results suggest that implicit biases may exact costs by subtly excluding qualified people from the very organizations that seek their talents.
Legal cases also reveal the real costs of implicit biases, both economic and social. Consider Price Waterhouse v. Hopkins. Despite logging more billable hours than her peers, bringing in $25 million to the company, and earning the praise of her clients, Ann Hopkins was turned down for partner, and she sued. The details of the case reveal that her evaluators were explicitly prejudiced in their attitudes. For example, they had commented that Ann "overcompensated for being a woman" and needed a "course at charm school." But perhaps more damning from a legal standpoint was blunt testimony from experimental research. Testifying as an expert witness for the defense, psychology professor Susan Fiske, now at Princeton University, argued that the potential for biased decision making is inherent in a system in which a person has "solo" status, that is, a system in which the person is the only one of a kind (the only woman, the only African-American, the only person with a disability, and the like). Judge Gerhard Gesell concluded that "a far more subtle process [than the usual discriminatory intent] is involved" in the assessments made of Ann Hopkins, and she won both in a lower court and in the Supreme Court in what is now a landmark case in discrimination law.
Likewise, the 1999 case of Thomas v. Kodak demonstrates that implicit biases can be the basis for rulings. Here, the court posed the question of "whether the employer consciously intended to base the evaluations on race or simply did so because of unthinking stereotypes or bias." The court concluded that plaintiffs can indeed challenge "subjective evaluations which could easily mask covert or unconscious race discrimination." Although courts are careful not to assign responsibility easily for unintentional biases, these cases demonstrate the potential for corporate liability that such patterns of behavior could unwittingly create.
In-Group Favoritism: Bias That Favors Your Group

Think about some of the favors you have done in recent years, whether for a friend, a relative, or a colleague. Have you helped someone get a useful introduction, admission to a school, or a job? Most of us are glad to help out with such favors. Not surprisingly, we tend to do more favors for those we know, and those we know tend to be like ourselves: people who share our nationality, social class, and perhaps religion, race, employer, or alma mater. This all sounds rather innocent. What's wrong with asking your neighbor, the university dean, to meet with a coworker's son? Isn't it just being helpful to recommend a former sorority sister for a job or to talk to your banker cousin when a friend from church gets turned down for a home loan?
Few people set out to exclude anyone through such acts of kindness. But when those in the majority or those in power allocate scarce resources (such as jobs, promotions, and mortgages) to people just like them, they effectively discriminate against those who are different from them. Such "in-group favoritism" amounts to giving extra credit for group membership.
Are You Biased?

Are you willing to bet that you feel the same way toward European-Americans as you do toward African-Americans? How about women versus men? Or older people versus younger ones? Think twice before you take that bet. Visit implicit.harvard.edu or www.tolerance.org/hidden_bias to examine your unconscious attitudes.

The Implicit Association Tests available on these sites reveal unconscious beliefs by asking takers to make split-second associations between words with positive or negative connotations and images representing different types of people. The various tests on these sites expose the differences, or the alignment, between test takers' conscious and unconscious attitudes toward people of different races, sexual orientation, or physical characteristics. Data gathered from over 2.5 million online tests and further research tells us that unconscious biases are:
• widely prevalent. At least 75% of test takers show an implicit bias favoring the young, the rich, and whites.
• robust. The mere conscious desire not to be biased does not eliminate implicit bias.
• contrary to conscious intention. Although people tend to report little or no conscious bias against African-Americans, Arabs, Arab-Americans, Jews, gay men, lesbians, or the poor, they show substantial biases on implicit measures.
• different in degree depending on group status. Minority group members tend to show less implicit preference for their own group than majority group members show for theirs. For example, African-Americans report strong preference for their group on explicit measures but show relatively less implicit preference in the tests. Conversely, white Americans report a low explicit bias for their group but a higher implicit bias.
• consequential. Those who show higher levels of bias on the IAT are also likely to behave in ways that are more biased in face-to-face interactions with members of the group they are biased against and in the choices they make, such as hiring decisions.
• costly. Research currently under way in our lab suggests that implicit bias generates a "stereotype tax": negotiators leave money on the table because biases cause them to miss opportunities to learn about their opponent and thus create additional value through mutually beneficial trade-offs.
Would you be willing to risk being in the group disadvantaged by your own decision?
Yet while discriminating against those who are different is considered unethical, helping people close to us is often viewed favorably. Think about the number of companies that explicitly encourage this by offering hiring bonuses to employees who refer their friends for job opportunities.
But consider the finding that banks in the United States are more likely to deny a mortgage application from a black person than from a white person, even when the applicants are equally qualified. The common view has been that banks are hostile to African-Americans. While this may be true of some banks and some loan officers, social psychologist David Messick has argued that in-group favoritism is more likely to be at the root of such discriminatory lending. A white loan officer may feel hopeful or lenient toward an unqualified white applicant while following the bank's lending standards strictly with an unqualified black applicant. In denying the black applicant's mortgage, the loan officer may not be expressing hostility toward blacks so much as favoritism toward whites. It's a subtle but crucial distinction.
The ethical cost is clear and should be reason enough to address the problem. But such inadvertent bias produces an additional effect: It erodes the bottom line. Lenders who discriminate in this way, for example, incur bad debt costs they could have avoided if their lending decisions were more objective. They also may find themselves exposed to damaging publicity or discrimination lawsuits if the skewed lending pattern is publicly revealed. In a different context, companies may pay a real cost for marginal hires who wouldn't have made the grade but for the sympathetic hiring manager swayed by in-group favoritism.
In-group favoritism is tenacious when membership confers clear advantages, as it does, for instance, among whites and other dominant social groups. (It may be weaker or absent among people whose group membership offers little societal advantage.) Thus for a wide array of managerial tasks, from hiring, firing, and promoting to contracting services and forming partnerships, qualified minority candidates are subtly and unconsciously discriminated against, sometimes simply because they are in the minority: There are not enough of them to counter the propensity for in-group favoritism in the majority.
Overclaiming Credit: Bias That Favors You

It's only natural for successful people to hold positive views about themselves. But many studies show that the majority of people consider themselves above average on a host of measures, from intelligence to driving ability. Business executives are no exception. We tend to overrate our individual contribution to groups, which, bluntly put, tends to lead to an overblown sense of entitlement. We become the unabashed, repeated beneficiaries of this unconscious bias, and the more we think only of our own contributions, the less fairly we judge others with whom we work.
Lab research demonstrates this most personal of biases. At Harvard, Eugene Caruso, Nick Epley, and Max Bazerman recently asked MBA students in study groups to estimate what portion of their group's work each had done. The sum of the contributions by all members, of course, must add up to 100%. But the researchers found that the totals for each study group averaged 139%. In a related study, Caruso and his colleagues uncovered rampant overestimates by academic authors of their contribution to shared research projects. Sadly, but not surprisingly, the more the sum of the total estimated group effort exceeded 100% (in other words, the more credit each person claimed), the less the parties wanted to collaborate in the future.
Likewise in business, claiming too much credit can destabilize alliances. When each party in a strategic partnership claims too much credit for its own contribution and becomes skeptical about whether the other is doing its fair share, they both tend to reduce their contributions to compensate. This has obvious repercussions for the joint venture's performance.
Unconscious overclaiming can be expected to reduce the performance and longevity of groups within organizations, just as it diminished the academic authors' willingness to collaborate. It can also take a toll on employee commitment. Think about how employees perceive raises. Most are not so different from the children at Lake Wobegon, believing that they, too, rank in the upper half of their peer group. But many necessarily get pay increases that are below the average. If an employee learns of a colleague's greater compensation while honestly believing that he himself is more deserving, resentment may be natural. At best, his resentment might translate into reduced commitment and performance. At worst, he may leave the organization that, it seems, doesn't appreciate his contribution.
Conflict of Interest: Bias That Favors Those Who Can Benefit You

Everyone knows that conflict of interest can lead to intentionally corrupt behavior. But numerous psychological experiments show how powerfully such conflicts can unintentionally skew decision making. (For an examination of the evidence in one business arena, see Max Bazerman, George Loewenstein, and Don Moore's November 2002 HBR article, "Why Good Accountants Do Bad Audits.") These experiments suggest that the work world is rife with situations in which such conflicts lead honest, ethical professionals to unconsciously make unsound and unethical recommendations.
Physicians, for instance, face conflicts of interest when they accept payment for referring patients into clinical trials. While, surely, most physicians consciously believe that their referrals are the patient's best clinical option, how do they know that the promise of payment did not skew their decisions? Similarly, many lawyers earn fees based on their clients' awards or settlements. Since going to trial is expensive and uncertain, settling out of court is often an attractive option for the lawyer. Attorneys may consciously believe that settling is in their clients' best interests. But how can they be objective, unbiased judges under these circumstances?
Research done with brokerage house analysts demonstrates how conflict of interest can unconsciously distort decision making. A survey of analysts conducted by the financial research service First Call showed that during a period in 2000 when the Nasdaq dropped 60%, fully 99% of brokerage analysts' client recommendations remained "strong buy," "buy," or "hold." What accounts for this discrepancy between what was happening and what was recommended? The answer may lie in a system that fosters conflicts of interest. A portion of analysts' pay is based on brokerage firm revenues. Some firms even tie analysts' compensation to the amount of business the analysts bring in from clients, giving analysts an obvious incentive to prolong and extend their relationships with clients. But to assume that during this Nasdaq free fall all brokerage house analysts were consciously corrupt, milking their clients to exploit this incentive system, defies common sense. Surely there were some bad apples. But how much more likely it is that most of these analysts believed their recommendations were sound and in their clients' best interests. What many didn't appreciate was that the built-in conflict of interest in their compensation incentives made it impossible for them to see the implicit bias in their own flawed recommendations.
Trying Harder Isn't Enough

As companies keep collapsing into financial scandal and ruin, corporations are responding with ethics-training programs for managers, and many of the world's leading business schools have created new courses and chaired professorships in ethics. Many of these efforts focus on teaching broad principles of moral philosophy to help managers understand the ethical challenges they face.
We applaud these efforts, but we doubt that a well-intentioned, just-try-harder approach will fundamentally improve the quality of executives' decision making. To do that, ethics training must be broadened to include what is now known about how our minds work and must expose managers directly to the unconscious mechanisms that underlie biased decision making. And it must provide managers with exercises and interventions that can root out the biases that lead to bad decisions.
Managers can make wiser, more ethical decisions if they become mindful of their unconscious biases. But how can we get at something outside our conscious awareness? By bringing the conscious mind to bear. Just as the driver of a misaligned car deliberately counteracts its pull, so can managers develop conscious strategies to counteract the pull of their unconscious biases. What's required is vigilance: continual awareness of the forces that can cause decision making to veer from its intended course and continual adjustments to counteract them. Those adjustments fall into three general categories: collecting data, shaping the environment, and broadening the decision-making process.
Collect data. The first step to reducing unconscious bias is to collect data to reveal its presence. Often, the data will be counterintuitive. Consider many people's surprise to learn of their own gender and racial biases on the IAT. Why the surprise? Because most of us trust the "statistics" our intuition provides. Better data are easily, but rarely, collected. One way to get those data is to examine our decisions in a systematic way.
Remember the MBA study groups whose participants overestimated their individual contributions to the group effort so that the totals averaged 139%? When the researchers asked group members to estimate what each of the other members' contributions were before claiming their own, the total fell to 121%. The tendency to claim too much credit still persisted, but this strategy of "unpacking" the work reduced the magnitude of the bias. In environments characterized by "I deserve more than you're giving me" claims, merely asking team members to unpack the contributions of others before claiming their own share of the pot usually aligns claims more closely with what's actually deserved. As this example demonstrates, such systematic audits of both individual and group decision-making processes can occur even as the decisions are being made.
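As a concrete, entirely hypothetical illustration of such a systematic audit, the short sketch below tallies each member's claimed share of a project and flags overclaiming when the claims sum to more than 100%, then shows a proportional adjustment as a starting point for the "unpacking" conversation. The names and percentages are invented (they happen to total 139%, echoing the study-group average above); this is one possible way to make the check explicit, not a method prescribed by the authors.

```python
# Hypothetical "unpacking" audit: self-reported shares of credit that sum to
# more than 100% signal the overclaiming bias discussed in the article.
claimed_share = {"Avery": 45, "Blake": 40, "Casey": 30, "Devon": 24}  # invented, in percent

total = sum(claimed_share.values())
print(f"Total claimed: {total}%")  # 139% here, well above the 100% actually available

if total > 100:
    for name, share in claimed_share.items():
        adjusted = 100 * share / total  # proportionally rescale so shares sum to 100%
        print(f"{name}: claimed {share}%, proportionally adjusted to {adjusted:.0f}%")
```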
Unpacking is a simple strategy that managers should routinely use to evaluate the fairness of their own claims within the organization. But they can also apply it in any situation where team members or subordinates may be overclaiming. For example, in explaining a raise that an employee feels is inadequate, a manager should ask the subordinate not what he thinks he alone deserves but what he considers an appropriate raise after taking into account each coworker's contribution and the pool available for pay increases. Similarly, when an individual feels she's doing more than her fair share of a team's work, asking her to consider other people's efforts before estimating her own can help align her perception with reality, restore her commitment, and reduce a skewed sense of entitlement.
Taking the IAT is another valuable strategy for collecting data. We recommend that you and others in your organization use the test to expose your own implicit biases. But one word of warning: Because the test is an educational and research tool, not a selection or evaluation tool, it is critical that you consider your results and others' to be private information. Simply knowing the magnitude and pervasiveness of your own biases can help direct your attention to areas of decision making that are in need of careful examination and reconsideration. For example, a manager whose testing reveals a bias toward certain groups ought to examine her hiring practices to see if she has indeed been disproportionately favoring those groups. But because so many people harbor such biases, they can also be generally acknowledged, and that knowledge can be used as the basis for changing the way decisions are made. It is important to guard against using pervasiveness to justify complacency and inaction: Pervasiveness of bias is not a mark of its appropriateness any more than poor eyesight is considered so ordinary a condition that it does not require corrective lenses.
Shape your environment. Research shows that implicit attitudes can be shaped by external cues in the environment. For example, Curtis Hardin and colleagues at UCLA used the IAT to study whether subjects' implicit race bias would be affected if the test was administered by a black investigator. One group of students took the test under the guidance of a white experimenter; another group took the test with a black experimenter. The mere presence of a black experimenter, Hardin found, reduced the level of subjects' implicit anti-black bias on the IAT. Numerous similar studies have shown similar effects with other social groups. What accounts for such shifts? We can speculate that experimenters in classrooms are assumed to be competent, in charge, and authoritative. Subjects guided by a black experimenter attribute these positive characteristics to that person, and then perhaps to the group as a whole. These findings suggest that one remedy for implicit bias is to expose oneself to images and social environments that challenge stereotypes.
We know of a judge whose court is located in a predominantly African-American neighborhood. Because of the crime and arrest patterns in the community, most people the judge sentences are black. The judge confronted a paradox. On the one hand, she took a judicial oath to be objective and egalitarian, and indeed she consciously believed that her decisions were unbiased. On the other hand, every day she was exposed to an environment that reinforced the association between black men and crime. Although she consciously rejected racial stereotypes, she suspected that she harbored unconscious prejudices merely from working in a segregated world. Immersed in this environment each day, she wondered if it was possible to give the defendants a fair hearing.
Rather than allow her environment to reinforce a bias, the judge created an alternative environment. She spent a vacation week sitting in a fellow judge's court in a neighborhood where the criminals being tried were predominantly white. Case after case challenged the stereotype of blacks as criminal and whites as law abiding and so challenged any bias against blacks that she might have harbored.
Think about the possibly biased associations your workplace fosters. Is there, perhaps, a "wall of fame" with pictures of high achievers all cast from the same mold? Are certain types of managers invariably promoted? Do people overuse certain analogies drawn from stereotypical or narrow domains of knowledge (sports metaphors, for instance, or cooking terms)? Managers can audit their organization to uncover such patterns or cues that unwittingly lead to stereotypical associations.
If an audit reveals that the environment may be promoting unconsciously biased or unethical behavior, consider creating countervailing experiences, as the judge did. For example, if your department reinforces the stereotype of men as naturally dominant in a hierarchy (most managers are male, and most assistants are female), find a department with women in leadership positions and set up a shadow program. Both groups will benefit from the exchange of