United States Securities and Exchange Commission
Washington, D.C. 20549
NOTICE OF EXEMPT SOLICITATION
Pursuant to Rule 14a-103
Name of the Registrant: Amazon.com, Inc.
Name of persons relying on exemption: John Harrington, Harrington Investments, Inc.
Address of persons relying on exemption: Harrington Investments, Inc., 1001 Second Street, Suite 325, Napa, CA 94559
Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934. Submission is not required of this filer under the terms of the Rule, but is made voluntarily in the interest of public disclosure and consideration of these important issues.
IMPORTANT PROXY VOTING MATERIAL
Shareholder Rebuttal to Amazon.com, Inc.
Requesting an Independent Study of Rekognition and Report to Shareholders
Harrington Investments, Inc. urges you to vote FOR Item #7 on the proxy, the Shareholder Proposal Requesting an Independent Study of Rekognition and Report to Shareholders, at the Amazon.com, Inc. Annual Meeting on May 22, 2019.
Support for this resolution is warranted because:
1. Investors lack disclosure on how the Company assesses and manages the financial, reputational, regulatory, legal, and human capital management risks posed by the sale of facial recognition technology to government.
2. Rekognition may adversely impact privacy rights, freedom of association and assembly, and the right to non-discrimination for people of color, immigrants, and activists in the United States and those living under authoritarian or repressive governments outside the United States. Investors need more information from the Company, including independent stakeholder input, so they can be assured that the Company’s policies and practices are sufficient to protect shareholders from the risks posed by sales of Rekognition to government.
3. Amazon lags peers in governance, oversight and management practices to assess and address the ethical, legal, and reputational risks associated with the use of its products. Investors are requesting that Amazon’s board fulfill its fiduciary duty of care to conduct the necessary due diligence evaluating privacy and civil rights risk, financial and operational risks, and human rights risks around the sale of Rekognition to government as a basic oversight and risk management practice.
SUMMARY OF RESOLUTION
Resolved: Shareholders request the Board of Directors commission an independent study of Rekognition and report to shareholders regarding:
● The extent to which such technology may endanger, threaten, or violate privacy and/or civil rights, and unfairly or disproportionately target or surveil people of color, immigrants and activists in the United States;
● The extent to which such technologies may be marketed and sold to authoritarian or repressive foreign governments, identified by the United States Department of State Country Reports on Human Rights Practices;
● The financial or operational risks associated with these human rights issues.
The report should be produced at reasonable expense, exclude proprietary or legally privileged information, and be published no later than September 1, 2019.
We believe the Board of Directors’ fiduciary duty of care extends to thoroughly evaluating the impacts on reputation and shareholder value of any surveillance technology our Company produces or markets on which significant concerns are raised regarding the danger to the civil and privacy rights of customers and other stakeholders. The recent failures of Facebook to engage in sufficient content and privacy management, and the resulting economic impacts to that company, should be taken as sufficient warning: it could happen to Amazon.
ARGUMENTS IN FAVOR OF THE RESOLUTION ON RISKS OF SALES OF FACIAL RECOGNITION SOFTWARE
1. Amazon is exposed to financial, reputational, regulatory, legal, and human capital management risk due to its sales of Rekognition to government.
Investors are concerned by sales of Rekognition to government in the United States and globally because this highly controversial technology poses tremendous risks involving civil and human rights and government surveillance.
Rekognition is a facial recognition tool, offered as one application available through Amazon Web Services (AWS), that is deployed to identify faces and other imagery in photographic and video recordings. AWS offers cloud-based products including “compute, storage, databases, analytics, networking, mobile, developer tools, management tools, IoT, security and enterprise applications.”1 As promoted on the AWS website, Rekognition is marketed in a suite of apps; it is offered “free” to new AWS subscribers.2 AWS’s apparent intention is to offer apps that will induce subscribers to use more of AWS’s cloud services.
AWS is by far the largest provider of Internet “cloud” services in the world, with 2018 revenue of $25.6 billion.3 AWS currently provides cloud services for all 17 United States intelligence agencies, as well as for government agencies internationally4 in the United Kingdom, Italy, Singapore, Belgium, Canada, and Turkey. The FBI is testing Amazon’s Rekognition5, and Amazon is attempting to sell Rekognition to Immigration and Customs Enforcement (ICE)6. In July 2018, the Orlando Police Department and the city of Orlando, Florida announced they would continue to use Rekognition after an initial pilot program.7 Amazon lists the Washington County Sheriff’s Office in Oregon as a Rekognition customer.8
According to the United Nations Guiding Principles on Business and Human Rights (Principle 17),9 Amazon has a responsibility for the use of its products, and human rights due diligence should cover “impacts that the business enterprise may cause or contribute to through its own activities, or which may be directly linked to its operations, products or services by its business relationships.”
In the hands of government, Rekognition threatens civil liberties and civil rights for all members of society, and especially for people who are more likely to be surveilled, profiled and targeted, including people of color and immigrants. Clare Garvie, of Georgetown Law’s Center on Privacy & Technology, has written10: “A mistake by a face-scanning surveillance system on a body camera could be lethal. An officer, alerted to a potential threat to public safety or to himself, must, in an instant, decide whether to draw his weapon. A false alert places an innocent person in those crosshairs.”
The American Civil Liberties Union (ACLU)11 has noted that facial recognition technology threatens to “chill First Amendment-protected activity like engaging in protest or practicing religion, and it can be used to subject immigrants to further abuse from the government.”
Amazon has repeatedly responded to the controversy in defense of Rekognition. In February 2019, AWS’s Global Public Policy VP published a blog post outlining key areas where the Company was taking the unusual step of calling for enhanced regulatory policies surrounding facial recognition technology, especially when used by police.12 Amazon’s blog post is a clear acknowledgment that the Company has placed onto the market a technology that is inadequately controlled and regulated. Public expectations of privacy and respect for civil rights are not presently protected, and investors need greater disclosure from the Company about how civil liberties, civil rights and human rights risks are being managed.
While Amazon has been responsive to public controversy, the Company has been less responsive to the more than a dozen shareholders who expressed concerns about Rekognition in June 2018. On June 15, 2018, the lead proponent of this resolution was also the lead signatory on an initial letter by 19 financial services firms holding Amazon stock, including wealth management companies and registered investment advisors, raising strong objections to the introduction of the Company’s facial recognition technology. Amazon neither replied to that communication nor acknowledged its receipt. The issues raised specifically included “substantial risks for our Company negatively impacting our Company’s stock valuation and increasing financial risk for shareholders.” The June 15, 2018 letter also stated:
“…we have seen no evidence of our Board of Directors conducting fiduciary oversight on how Rekognition may or may not, should or should not, be deployed. The recent experience and scrutiny of Facebook demonstrated the degree to which these new issues may undermine company value as the detrimental impacts on society become clear. While Rekognition may be intended to enhance some law enforcement activity, we are deeply concerned it may ultimately violate civil and human rights.”
While the accuracy of Rekognition’s technology is a concern — multiple studies by M.I.T. researchers and the ACLU have found that racial and gender bias is embedded in Rekognition13 — there are more fundamental issues, including whether facial recognition should be deployed at all because of the role the technology plays in enabling ubiquitous government surveillance.
Indeed, scientific and academic research suggests that over time the societal risks embedded in a technology such as Rekognition may not decline but may instead increase quickly and dramatically as Rekognition is deployed more widely. In April 2019, 25 prominent artificial-intelligence researchers — including experts at Amazon competitors Google, Facebook and Microsoft, and a winner of the prestigious Turing Award — called on Amazon to stop selling Rekognition to government.14 “There are no laws or required standards to ensure that Rekognition is used in a manner that does not infringe on civil liberties,” the AI researchers wrote.
This is particularly concerning for those whose human rights are neglected by autocratic government regimes. Human rights organizations cite the deployment of facial recognition in China, where, according to one expert, “surveillance technologies are giving the government a sense that it can finally achieve the level of control over people’s lives that it aspires to.”15
Scholars Woodrow Hartzog and Evan Selinger have written16: “We believe facial recognition technology is the most uniquely dangerous surveillance mechanism ever invented...when technologies become so dangerous, and the harm-to-benefit ratio becomes so imbalanced, categorical bans are worth considering.”
16 https://medium.com/s/story/facial-recognition-is-the-perfect-tool-for-oppression-bc2a08f0fe66 August 2, 2018
Amazon is reported to be the second most trusted institution in the United States17, and sales of Rekognition threaten this extraordinary relationship of trust on which the Company relies for success. Given the serious potential risks outlined above, manifestation of these potential harms could result in a negative financial impact on Amazon through, for example, a decrease in sales or contracts as diminished consumer trust reduces demand for Amazon products and services and increases the cost of business. Research shows that being linked to adverse human rights impacts can damage a company’s reputation and harm its prospects of securing future contracts. A recent study quantifying the reputational harm to Energy Transfer Partners (ETP) from its inadequate stakeholder consultation and from the Standing Rock Sioux Tribe’s public opposition to the Dakota Access Pipeline found that the resulting social tension weighed on ETP’s stock price and that ETP underperformed relative to market expectations as a result of the controversy. Growing and widespread social controversy and public pressure surrounding Rekognition may similarly harm Amazon’s financial performance.
Privacy as a material financial risk
Rekognition, as a potential tool for surveillance when sold to government, already presents material issues for Amazon regarding consumer privacy. According to the Sustainability Accounting Standards Board (SASB)18, which identifies environmental, social and governance factors most likely to materially impact the financial condition or operating performance of companies in an industry, consumer privacy is likely to be a material issue for companies operating in technology and communications. This includes the management of risks related to the use of personally identifiable information (such as biometric data collected through facial recognition technology), social issues that may arise from a company’s collecting information (such as public controversies surrounding the collection of facial imagery), and managing evolving regulation (such as regulation around the commercial use of facial recognition).
Reputational risk posed by sales of Rekognition to government
The public controversy surrounding Rekognition presents critical risks to the Company’s reputation, including the willingness of consumers to trust the Company to safeguard their privacy and civil and human rights: storing data on Amazon’s cloud requires trusting the Company to keep that data safe, making consumer trust one of Amazon’s most important intangible assets. This comes as Amazon, from its leadership position in the technology sector, confronts growing public criticism of the role that technology companies, including Amazon.com, and their products and services play in societies and economies around the world. Thus, proponents believe Rekognition may threaten Amazon’s long-term prospects.
Amazon is the subject of mounting public controversy because of this product. In May 2018, the ACLU, along with a coalition of civil rights organizations, sent a public letter to the Company demanding that it stop selling Rekognition to government.19 In January 2019, over 85 activist groups – including the ACLU, the National Lawyers Guild chapters, and Freedom of the Press Foundation — signed an open letter expressing their concern for how Rekognition technology threatens community safety, privacy, and human rights.20
19 Iqra Asghar and Kade Crockford, Amazon Should Follow Google’s Lead and Stop Selling Face Surveillance Tech to Cops, PRIVACY SOS (June 2, 2018), https://privacysos.org/blog/amazon-follow-googles-lead-stop-selling-facesurveillance-tech-cops/.
20 Open Letter to Amazon Against Police and Government Use of Rekognition, International Committee for Robot Arms Control, https://www.icrac.net/open-letter-to-amazon-against-police-and-government-use-of-rekognition/.
The increasing controversy surrounding Rekognition highlights the fragile relationship of trust between the Company and its consumers, employees, and the public at large. A number of the Company’s products – Alexa, Ring, and Eero – could face a spillover effect if Amazon’s ability to maintain the trust of customers is breached by concerns about privacy and surveillance. Moreover, in addition to the risk exposure the Company faces by virtue of being a business operating in the technology sector, its product pipeline and pending patent applications demonstrate that the Company’s trajectory will confront just such concerns. For example, two facial recognition-related patent applications filed by the Company feature a technology that could use multiple cameras to create a composite image of a person’s partially seen face, and could then automatically alert law enforcement if a “suspicious” person or known criminal is in view of Ring’s cameras.21 Only this month, multiple media outlets sparked controversy with reports that Amazon employees were “listening” to recordings from Alexa devices in the home.22
A degraded reputation has also had a negative impact on the Company’s social license to operate. In 2019, Amazon’s standing fell in the Axios Harris Poll 100 Reputation Ratings23, which ranks the reputation of the most visible companies in the U.S.:
“For three years, America voted Amazon its top company for corporate reputation. ...But suddenly, amid a high profile search and last minute cancellation for HQ2, and the ensuing fallout with Alexandria Ocasio-Cortez and company, America still loves its smiling boxes, but are beginning to grow uneasy with Amazon’s reach and power…while the public ranked Amazon #2 for Products & Services, they ranked Amazon much lower for ethical attributes like “speaking out on social issues that are important to me” (#36), “maintains high ethical standards” (#16), and “looks like a good company to work for” (#12).”
The deal for a major new HQ2 facility in New York City recently fell through. While Rekognition was not the main issue raised in the controversial discussions, it was cited prominently by a number of key community leaders, including New York City Council members, U.S. Representative Alexandria Ocasio-Cortez, and numerous immigrant rights groups concerned by the threat to immigrant communities posed by facial recognition technology in the hands of government, including ICE.
Regulatory risk posed by sales of Rekognition to government
Independently assessing Rekognition and the risks it poses to civil and human rights would not only protect stakeholders who may be harmed by Rekognition; it would also help the Company minimize expensive legal, operational, and reputational risks, benefiting Amazon’s – and in turn, shareholders’ – bottom line.
Amazon’s sales of Rekognition to government face regulatory risk and an environment of uncertainty. In December 2018, the AI Now Institute at New York University warned of the “urgent need” for stricter regulation of facial recognition technology.24
Members of Congress have written multiple letters to CEO Jeff Bezos expressing concerns about Amazon’s Rekognition product.25
24 AI Now Report 2018, AI Now Institute at New York University (December 2018), https://ainowinstitute.org/AI_Now_2018_Report.pdf.
In July 2018, five U.S. Senators called on the federal Government Accountability Office (GAO) to investigate the commercial and government use, and potential abuse, of facial recognition technology.26 This month, GAO wrote27 to the Department of Justice saying the agency had failed to address GAO’s stated concerns about FBI use of face recognition technology (the FBI is piloting Rekognition).
In March 2019, two U.S. Senators introduced the Commercial Facial Recognition Privacy Act of 2019 that would prohibit commercial users of facial recognition technology “from collecting and re-sharing data for identifying or tracking consumers without their consent.”28
In the United States, legislation that would ban government use of facial recognition technology has been recently introduced in the states of Massachusetts29 and Washington30 and in the city of San Francisco.31
The Company has itself noted, in the risk factors discussion of its 2019 10-K32, the potential materiality of risks associated with a regulatory framework that has not yet caught up with such emerging technologies:
“It is not clear how existing laws governing issues such as property ownership, libel, data protection, and personal privacy apply to the Internet, e-commerce, digital content, web services, and artificial intelligence technologies and services. Unfavorable regulations, laws, and decisions interpreting or applying those laws and regulations could diminish the demand for, or availability of, our products and services and increase our cost of doing business.”
Legal risk posed by sales of Rekognition to government
Meanwhile, as Amazon continues to sell Rekognition in a largely unregulated environment, consumers as well as local, state and federal government actors may escalate legal challenges to the Company to establish stronger precedents around consumer privacy, posing the risk of major regulatory fines and litigation.
The Illinois Supreme Court in January 2019 expanded the potential liability for companies that sell facial-recognition technology under the state’s Biometric Information Privacy Act by ruling that plaintiffs only need to prove technical violations rather than actual injury or damages, further clearing the way for potential lawsuits involving facial technology.
Investors should have disclosure from the Company about how Amazon assesses legal risk. Supporting this proposal and conducting a study of Rekognition and its potential impact, as outlined in the proposal, would allow the board to properly understand and convey to shareholders the extent to which these sales would present legal risks, and the company’s plan to mitigate those risks, before it is too late.
32 Amazon 10K for 2019 - Risk Factors discussion.
Human capital management risk posed by sales of Rekognition to government
Amazon’s status as an employer of choice, and its ability to stay consistent with the values of its employees, is being undermined by the Company’s internal management and potential sale of surveillance technologies, including Rekognition. The Company’s lack of transparency about the nature of its sales, and its failure to respond to employees who do not want to use their time and talent in support of selling surveillance technology to government, may be hurting Amazon’s ability to attract, hire, retain, and maintain good relations with employees. Amazon risks losing top new talent as millennials in the workforce seek employment from companies that match their values. An Accenture report found that 70% of 2016 college graduates prefer an employer who offers a positive social atmosphere, and a company that “does good,” over a higher salary; 92% said it was important that their employer demonstrate social responsibility.33
The Company’s attempt in June 2018 to sell Rekognition to ICE fueled a backlash among Amazon employees, 450 of whom signed an open letter to the Company in protest, saying:
“...We learn from history, and we understand how IBM’s systems were employed in the 1940s to help Hitler. IBM did not take responsibility then, and by the time their role was understood, it was too late. We will not let that happen again. The time to act is now. We call on you to: Stop selling facial recognition services to law enforcement.”34
In addition to products that present these concerns, surveillance issues raised by Rekognition are mirrored within the Company itself in how it tracks its own employees. The Company was granted two patents recently that would allow it to track its workers’ hand movements through wristbands. By tracking detailed movements through artificial intelligence technologies, the Company can also obtain and record highly private information, such as when an employee takes a bathroom break. This degree of monitoring adds Fourth Amendment privacy intrusion concerns to the mix of a work culture already criticized for pressuring employees to work long hours and perform above all else, and investors need more information about how surveillance issues may be threatening the long-term performance and health of the company overall.
2. Investors need more information from the Company so they can be assured that the Company’s policies and practices are sufficient to protect shareholders and the public from the risks posed by sales of Rekognition to government.
Amazon’s existing policies to mitigate the risks of sales of Rekognition are insufficient and ineffectual
In its Opposition Statement, Amazon argued that the AWS Acceptable Use Policy protects against risks posed by illegal or harmful use of Rekognition. Amazon asserts in the Opposition Statement that customers who gain access to deploy Rekognition must comply with its Acceptable Use Policy which prohibits the use of its products “for any illegal, harmful, fraudulent, infringing or offensive use,” including the violation of laws related to privacy, discrimination and civil rights, and that Amazon has not received a single report of Rekognition being used in a harmful manner as posited in the proposal. Recent experience, however, demonstrates that the Acceptable Use Policy is no guarantee that the AWS platform will not be used in a harmful manner. According to multiple media reports35, a Mexico-based news site used Amazon servers to openly store 540 million records on Facebook users, including identification numbers, comments, reactions and account names. The problem was detected by researchers at an independent security firm which reportedly sent emails to Amazon “over many months” to alert it to the problem. Amazon reportedly failed to respond to those messages.
Amazon itself has shifted its stated position that existing company policies protect users and the Company from significant risk. For example, in the face of the groundswell of concern by NGOs and civil liberties experts, a leading company spokesman, as discussed above and in the Opposition Statement to the Proposal, tempered the Company’s position, calling for government regulation and “dialogue” among stakeholders, including shareholders. It is frankly difficult to understand why the company is opposing the current Proposal, as the Proposal itself seems to provide the opportunity to fulfill exactly the terms of Amazon’s own invitation for “open, honest, and earnest dialogue” with key stakeholders.
Shareholders are particularly concerned with violations by government customers. Currently, the FBI is petitioning for face recognition systems to be exempt from the prohibitions on tracking people during the exercise of their right to free speech.36 In addition, long-standing rules that have precluded the FBI and Department of Homeland Security from tracking the identity of individuals during the exercise of free speech appear to be at risk.37 The Acceptable Use Policy provisions are therefore far from self-executing, and as the FBI’s request for waiver of normal civil liberties protections demonstrates, the Policy is ineffective at protecting people from the harms of surveillance technology in the hands of government.
Insufficient board oversight of risks related to Rekognition
Governance of the significant risks presented by Rekognition is lacking. Instead of Company oversight, responsibility appears to fall to the customer for ensuring that the technology is not used in illegal or harmful ways that threaten shareholder value. This is too significant an issue for the Company to abdicate its responsibility to protect human rights. The Proponents are concerned that the Amazon board is not equipped to adequately identify and assess the risks posed by Rekognition. The directors overall lack the expertise that would give them the background or tools to assess the human rights impacts of machine learning, artificial intelligence, and the primary technologies behind a product like Rekognition. One possible exception is director Daniel Huttenlocher, who holds a Ph.D. in Computer Science from MIT and has expressed an interest in “emerging technologies.” The board also lacks any governance committee tasked with overseeing these risks. An evaluation conducted by an independent group of experts is therefore necessary to assure investors that the Company is adequately assessing and mitigating the risks posed by sales of Rekognition to government.
3. Amazon lags peers in governance, oversight and management practices to assess and address the ethical, legal, and reputational risks associated with the use of its products.
Amazon’s peer companies in the technology sector have implemented stronger governance mechanisms to oversee the human rights and ethics concerns posed by the sale of technologies. Amazon maintains three board committees: a Leadership Development and Compensation Committee, an Audit Committee, and a Nominating and Corporate Governance Committee. None of the committee charters mentions human rights concerns or ethics in any manner. For a company as large as Amazon, with business interests implicating so many areas of society, this is an especially egregious omission.
Alphabet, parent of Google and a competitor to Amazon in the technology field, said in December 2018 that it had opted not to offer facial recognition technology yet: “...unlike some other companies, Google Cloud has chosen not to offer general-purpose facial recognition APIs before working through important technology and policy questions.”38 This is clearly a reference to Amazon, which faced extensive public criticism and reputational damage during this period for its pursuit of sales of Rekognition. Google also dropped out of the bidding for JEDI, a $10 billion Department of Defense cloud contract, due to factors including ethical concerns.39
Brad Smith, president of Microsoft, called in June 2018 for government regulation of facial recognition technology. At the same time, shareholders were publicizing a letter to Amazon asking the Company to halt sales of its facial recognition product, and advocacy researchers were publicizing errors in the quality of Amazon’s product. Google also announced at this time a formal review structure to assess new projects, products, and deals, called Responsible AI Practices: a quarterly-updated set of technical recommendations and results shared with the wider AI ecosystem.
Similarly, in May 2018 Facebook announced an “AI ethics team” in order to “ensure that its artificial intelligence systems make decisions as ethically as possible, without biases."
In March of the same year, Microsoft formed the AI and Ethics in Engineering and Research (AETHER) Committee, bringing together senior leaders from across the company to focus on proactive formulation of internal policies and on how to respond to specific issues in a responsible way. Amazon has no similar effort. Microsoft’s committee, moreover, set clear guidelines to identify, study, and recommend policies, procedures, and best practices on the questions, challenges, and opportunities raised by AI’s influence on people and society, and invested in strategies and tools for detecting and addressing bias in AI systems and for implementing the new requirements established by the GDPR. Microsoft’s cleaner reputation in this area is a result of these measures.
Even without committees, companies can demonstrate their commitment to an ethical framework for new technologies by calling for their regulation, refusing to bid on or pursue contracts with government, or establishing specialty leadership roles. This month, Microsoft turned down a sale of facial recognition software to a California law enforcement agency, citing human rights concerns.40 Salesforce, for example, employs a chief ethical and humane use officer. Amazon has taken none of these paths.
Proponents of the resolution urge investors to vote in favor of the shareholder proposal Requesting a report on the impact of government use of Rekognition at Amazon.com, Inc. because:
1. Amazon is exposed to financial, reputational, regulatory, legal, and human capital risk due to its sales of Rekognition to government.
2. An independent evaluation of the impact of sales of Rekognition to government on privacy, civil and human rights is needed to assure investors that the Company’s existing policies protect shareholders and the public from risk.
3. Amazon lags peers in governance, oversight and management practices to assess and address the ethical, legal, and reputational risks, and the Company must fulfill its fiduciary duty of care around human rights risk by conducting an independent study of Rekognition.
The proponents urge you to vote FOR Item #7 on the proxy, the Shareholder Proposal Requesting an Independent Study of Rekognition and Report to Shareholders at the Amazon.com, Inc. Annual Meeting on May 22, 2019.
THE FOREGOING INFORMATION MAY BE DISSEMINATED TO SHAREHOLDERS VIA TELEPHONE, U.S. MAIL, E-MAIL, CERTAIN WEBSITES AND CERTAIN SOCIAL MEDIA VENUES, AND SHOULD NOT BE CONSTRUED AS INVESTMENT ADVICE OR AS A SOLICITATION OF AUTHORITY TO VOTE YOUR PROXY. THE COST OF DISSEMINATING THE FOREGOING INFORMATION TO SHAREHOLDERS IS BEING BORNE ENTIRELY BY ONE OR MORE OF THE CO-FILERS.
PROXY CARDS WILL NOT BE ACCEPTED BY ANY CO-FILER. PLEASE DO NOT SEND YOUR PROXY TO ANY CO-FILER.
TO VOTE YOUR PROXY, PLEASE FOLLOW THE INSTRUCTIONS ON YOUR PROXY CARD.
For questions regarding Amazon, Inc. – Item #7 – The Shareholder Proposal Requesting an Independent Study of Rekognition and Report to Shareholders at Amazon, Inc. submitted by Harrington Investments, Inc., please contact John Harrington, Harrington Investments, Inc. at 800-788-0154 or via email at firstname.lastname@example.org.