The University of Waterloo vending machine facial recognition system error controversy began on February 10, 2024, when Reddit user SquidKid47 posted a photo to r/uwaterloo of a vending machine error message that read “Invenda.Vending.FacialRecognition.App.exe — Application Error.”

Other Reddit users speculated about the privacy issues posed by the vending machine technology. River Stanley, a student at the university, investigated the machines in volume 154, issue 3 of the publication mathNEWS. The story gained traction on Reddit and Twitter and was quickly picked up by local and global news organizations covering privacy.

In comments to Stanley and other news reporters, the producer of the “smart” vending machines, Invenda Group, and its Canadian partner, Adaria Vending Services, maintained that the machines do not violate the European Union’s General Data Protection Regulation (GDPR) privacy legislation. Adaria stated that the machines use facial image software but do not store or collect identification or verification information, such as images.

Users and students grew concerned about their privacy and the security of their personal information while on campus. The controversy resulted in campus-goers demanding that the university remove all 29 machines. University of Waterloo media relations representative Rebecca Elming confirmed that the university requested the machines be removed and the existing technology disabled. Students and other members of the public also placed sticky notes and tape over the cameras on the machines.

The Office of the Privacy Commissioner of Canada (OPC) received two official complaints regarding this controversy. The OPC is observing the issue, but no official investigation has taken place.

Background
Facial recognition technology (FRT) is a biometric data application and artificial intelligence (AI) identification system that goes beyond tools like fingerprint, palm, and retinal scans. Biometric data are the unique physical or behavioural characteristics that can identify an individual, such as DNA, fingerprints, faces, and voice patterns. FRT systems create a template using software that is unique to the organization or company employing it. The template is produced by applying a web-like overlay to an individual’s facial structure from a scan of their face. The overlay is converted into a string of 0s and 1s that is enrolled in a facial recognition database. Any additional personal or sensitive information is tagged to this code and can be encrypted.
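
A minimal sketch of the enrollment step, assuming a numeric feature vector has already been extracted from a face scan, might look like the following Python example. It is purely illustrative; the function names, the 128-dimension vector, and the in-memory dictionary are hypothetical stand-ins, not the encoding of any actual FRT product.

```python
import numpy as np

# In-memory stand-in for a facial recognition database:
# record ID -> (binary template, tagged metadata).
database = {}

def make_template(feature_vector):
    """Binarize a facial feature vector into a string of 0s and 1s.

    Real systems derive the vector from a scan of the face and use
    proprietary, organization-specific encodings; sign-thresholding
    is just one simple way to produce a binary code.
    """
    return (np.asarray(feature_vector) > 0).astype(np.uint8)

def enroll(person_id, feature_vector, metadata=None):
    """Enroll a template in the database, tagging optional metadata."""
    database[person_id] = (make_template(feature_vector), metadata or {})

# Example enrollment with a random vector standing in for a face scan.
rng = np.random.default_rng(seed=0)
enroll("record-001", rng.standard_normal(128), {"site": "campus"})
print(database["record-001"][0][:16])  # first 16 bits of the template
```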

When the system scans the digital image of an individual for identification, the new facial scan creates another code that is run through the database using an algorithm and matched to that individual’s record. By mapping the individual’s features, the system compares likeness. Some reputable systems will not store an image with the template and cannot recreate a face from the code. The National Institute of Standards and Technology (NIST) publishes a public list of evaluated and graded systems to determine industry standards and accuracy levels and to deem systems reputable. NIST runs the voluntary Face Recognition Vendor Test (FRVT) program, which assesses an algorithm’s ability to complete one-to-one verification and one-to-many identification. Technology developers submit their algorithms for evaluation in the FRVT program and can win awards, such as the Face Recognition Prize Challenge (FRPC). Since the COVID-19 pandemic, FRVT has tested the effect of face masks on recognition accuracy, as well as differences in accuracy across demographic groups.
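
The matching step can be sketched as follows, continuing the illustrative binary templates above. Hamming distance is one common comparison metric for binary codes, not necessarily what any given vendor uses, and the threshold value here is arbitrary.

```python
import numpy as np

def hamming_distance(a, b):
    """Fraction of bits that differ between two binary templates."""
    return float(np.mean(a != b))

def match(probe_template, database, threshold=0.25):
    """One-to-many search: return the closest enrolled record,
    or None if nothing falls within the acceptance threshold.

    The threshold is illustrative; real systems tune it to balance
    false accepts against false rejects.
    """
    best_id, best_dist = None, 1.0
    for record_id, (template, _metadata) in database.items():
        dist = hamming_distance(probe_template, template)
        if dist < best_dist:
            best_id, best_dist = record_id, dist
    return best_id if best_dist <= threshold else None

# Example: a probe that is a noisy copy of an enrolled template.
rng = np.random.default_rng(seed=1)
enrolled = rng.integers(0, 2, size=128, dtype=np.uint8)
database = {"record-001": (enrolled, {})}
probe = enrolled.copy()
probe[:10] ^= 1  # flip 10 bits to simulate scan-to-scan variation
print(match(probe, database))  # "record-001"
```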

Facial Image Software
Not all systems that run biometric software are designed to recognize individuals. This algorithmic software operates at several general levels.

Facial Detection
Facial detection is the initial process of detecting a human face in an image presented to the system. The accuracy of detection depends on multiple factors, such as the amount of light or the facial expression. To improve accuracy, systems apply pre-processing and detection techniques such as the Viola–Jones detector, histograms of oriented gradients (HOG), or principal component analysis (PCA). Beyond finding faces in pictures, these algorithms can be used for video classification, object detection, or region-of-interest detection.
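
As a concrete illustration of the detection stage, the short Python sketch below runs OpenCV’s bundled Haar cascade, an implementation of the Viola–Jones approach. The input filename is hypothetical, and the parameter values are common defaults rather than settings from any deployed system.

```python
import cv2

# Load OpenCV's pre-trained frontal-face Haar cascade (Viola-Jones).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("photo.jpg")  # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # detection runs on grayscale

# detectMultiScale returns one (x, y, width, height) box per detected face.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                 minSize=(30, 30))
print(f"Detected {len(faces)} face(s)")
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
```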

Feature Extraction
This step in the biometric process extracts features of the face from the images produced at the detection stage. Extraction creates a signature of the face that captures the distribution of prominent features such as the eyes, nose, and mouth. Methods of extraction include HOG, local binary patterns (LBP), independent component analysis (ICA), and scale-invariant feature transform (SIFT).
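
For example, a HOG signature can be computed in a few lines with scikit-image. This is a generic sketch, not an extraction pipeline from any particular FRT vendor; the bundled sample image stands in for a detected face crop, and the parameters are common defaults.

```python
from skimage import color, data
from skimage.feature import hog

# Bundled sample image standing in for a detected face crop.
image = color.rgb2gray(data.astronaut())

# HOG summarizes local gradient directions into one long feature
# vector, a "signature" that downstream matching stages can compare.
features = hog(image,
               orientations=9,
               pixels_per_cell=(8, 8),
               cells_per_block=(2, 2),
               feature_vector=True)
print(features.shape)  # a 1-D descriptor of the whole image
```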

Facial Characterization
Similar to feature extraction, the system collects detailed information beyond facial data. This includes analyzing the background of an image, the individual’s gender or age, and facial expressions. Characterization systems do not detect personal information but can be used for audio description or to give advertisers data on consumer movements.

Face Recognition
Face recognition uses the identified features and characteristics to compare this information against a database. This matching system has two applications: verification and identification. Verification uses one-to-one matching, which compares a known face scan to the pre-existing template in the database and either accepts or rejects the match; this system would be used, for example, to unlock a phone with a facial scan. Identification uses one-to-many matching to identify an unknown face against a database by creating a template and matching it to the collected data. The one-to-many system is used by law enforcement. Techniques used for face recognition include correlation filters (CFs) and convolutional neural networks (CNNs).
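
The distinction between the two applications can be sketched in a few lines of Python. Cosine similarity over feature vectors and the 0.8 threshold are illustrative choices, not a description of any deployed system.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face feature vectors (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, enrolled, threshold=0.8):
    """One-to-one: accept or reject a claimed identity (e.g. phone unlock)."""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.8):
    """One-to-many: search a gallery of enrolled templates and return
    the best-matching ID, or None if no score passes the threshold."""
    scores = {pid: cosine_similarity(probe, tpl) for pid, tpl in gallery.items()}
    best_id = max(scores, key=scores.get)
    return best_id if scores[best_id] >= threshold else None

# Toy example with random vectors standing in for extracted features.
rng = np.random.default_rng(seed=2)
gallery = {f"person-{i}": rng.standard_normal(128) for i in range(5)}
probe = gallery["person-3"] + 0.1 * rng.standard_normal(128)  # noisy rescan
print(verify(probe, gallery["person-3"]))  # True
print(identify(probe, gallery))            # person-3
```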

Privacy Issues
FRT is considered controversial because it straddles the line between security and privacy and carries ethical and legal implications. Involuntary image capture and analysis affects the privacy rights of members of the public by infringing on their ability to choose who can access their data and whether it can be collected in the first place. As well, facial images and biometric data are personal information that individuals have the autonomy to control. The right to privacy of an individual’s personal information, or personal data, is protected under jurisdiction-specific legislation, like the European Union’s General Data Protection Regulation (GDPR) or the Personal Information Protection and Electronic Documents Act (PIPEDA) in Canada.

FRT allows government and commercial organizations to conduct surveillance of the public sphere. Surveillance, or institutionalized surveillance, is “the focused, systematic, and routine monitoring of behaviour, activities, or information that provides either tactical or strategic intelligence”. The major reasons governments give for implementing FRT are to deter crime, increase citizen safety, and construct “smart cities”. Yet FRT can enhance surveillance in ways that “can interfere with privacy”. Drawing comparisons to the dystopian society of George Orwell’s novel Nineteen Eighty-Four, some communication scholars view the collection of biometric data and FRT as a potential form of oppressive security and monitoring technology across networked systems.

Camera surveillance in Canada, as studied by the Surveillance Camera Awareness Network (SCAN), has grown alongside computerization since the early 2000s. The SCAN study reported on the implementation of open-street closed-circuit television (CCTV) in various Canadian cities. SCAN noted that, as cameras increased, visible signage posted along the perimeter of the surveilled area became the primary, and usually only, means of giving notice of camera surveillance, as recommended or required by federal privacy policy in order to achieve “informed consent”.

Consent
Consent is “compliance in or approval of what is done or proposed by another”. Individuals encounter the opportunity to consent daily through prompts to accept or decline the “terms and conditions” of a website or app. Accepting “terms and conditions” or “cookies” online can result in the collection of an individual’s personal information, such as their IP address or their activity across other apps.

There are four types of consent: express or explicit consent, implied consent, opt-in consent, and opt-out consent. Express consent is when an individual authorizes the collection of their data verbally or in writing. Implied consent occurs when an individual participates in a situation that can be construed as consent to the collection of data, such as entering a surveilled area with posted signage. Opt-in consent is when an individual is asked for permission before data is collected. Opt-out consent is when an individual can decline consent at any point during the use of an application, product, or collection.

With regard to personal information and its collection, privacy legislation and communication and legal scholars identify ‘consent’, specifically ‘informed consent’ or ‘meaningful consent’, as a step towards giving the public some authority over their data. Guidelines for obtaining meaningful consent, a report published by the OPC in May 2018, outlines “seven guiding principles” for organizations to achieve meaningful consent with their customers and reflects legislation within the Personal Information Protection and Electronic Documents Act (PIPEDA). PIPEDA mandates that, before consenting, individuals must understand the “nature, purpose and consequences” of their consent. To achieve meaningful consent, an organization must inform the individuals using its services of its privacy practices in a digestible manner, including what personal information will be collected, with whom it will be shared, why it is being collected, and what potential harms exist. Meaningful consent also requires that an organization give individuals a genuine ability to choose between ‘yes’ and ‘no’.

Privacy Legislation
The security of biometric data concerns both the governmental and commercial spheres. Because the collection of personal information can lead to harm or other consequences for members of the public, federal and provincial privacy laws aim to protect the public’s right to control their data. The OPC maintains standards and enforcement mechanisms, including guidelines for organizations and government institutions that mandate offering users the choice of ‘informed’ and ‘meaningful consent’, and it conducts investigations into policy violations.

Freedom of Information and Protection of Privacy Act (FIPPA)
The Ontario Freedom of Information and Protection of Privacy Act (FIPPA) came into force on January 1, 1988, following Bill 34. Under FIPPA, personal information is any “recorded information about an identifiable individual”. Biometric data, such as an individual’s fingerprints, photograph, iris scans, or blood type, is considered protected personal information. FIPPA’s privacy rules set out the regulations around collection, use, and notice when personal information is recorded by institutions.

Personal Information Protection and Electronic Documents Act (PIPEDA)
The Personal Information Protection and Electronic Documents Act (PIPEDA) became law on April 13, 2000. PIPEDA governs data privacy and the collection and use of individuals’ personal information by private sector organizations. PIPEDA is set to be replaced by the Consumer Privacy Protection Act (CPPA), introduced by the Digital Charter Implementation Act, 2022. The CPPA would introduce stronger privacy law and personal information protection, as well as enhance the public’s control over their data and their ability to give meaningful consent. This legislation aims to balance commercial organizations’ collection and use of personal information against the rights of citizens.

General Data Protection Regulation (GDPR)
The European Union’s (EU) General Data Protection Regulation (GDPR) (Regulation (EU) 2016/679) was adopted on April 14, 2016 and came into effect on May 25, 2018. The regulation aims to strengthen individuals’ rights and control over their personal data and to provide guidelines for international business. It is considered the strongest personal data protection regulation in the world.

Article 4(1) of the GDPR defines personal data more broadly than the Article 2(a) definition in the previous legislation, the Data Protection Directive (DPD). Personal data is any information relating to “a natural person who is identified or identifiable”. The GDPR definition includes examples of identifiers such as internet protocol (IP) addresses and location data. One important EU mandate is that systems analyzing personal data must be built with “privacy by design” (PbD) and “privacy by default”; this applies whenever personal information is created, received, shared, or destroyed. PbD, the protection of data through the design of the technology itself, establishes accountability for systems using FRT, including those used by law enforcement.

Office of the Privacy Commissioner of Canada (OPC) Investigatory Standards and Purpose
The creation of the OPC in 1977 marked the first Canadian privacy law. The main responsibility of the OPC is to investigate privacy violation complaints from the public and report these issues to lawmakers. Bill C-43, passed in 1983, created legislation outside of the Canadian Human Rights Act: the Privacy Act and the Access to Information Act. Complaints regarding privacy issues in government sectors can be made under the Privacy Act.

The Privacy Commissioner of Canada is an Agent of Parliament who conducts independent and impartial investigations of complaints under section 29 of the Privacy Act against federal government institutions. The Privacy Commissioner and the OPC also investigate the handling of personal information by businesses under PIPEDA. Complaints from members of the public can include issues regarding the improper collection, use, disposal, and protection of personal information. For business investigations, the OPC publishes a select number of case summaries and findings to reflect the application of PIPEDA in day-to-day business activities involving personal information.

Dispositions
The result of a complaint under the Privacy Act or PIPEDA regarding a federal government institution or business is referred to as a disposition. As of April 1, 2020, there are eleven types of dispositions that can be assigned to a complaint.

Well-founded
The investigation found that the “institution or organization contravened a provision of the Privacy Act or PIPEDA”.

Well-founded and resolved
The OPC is satisfied with the corrective measures applied by the institution or organization that contravened a provision of the Privacy Act or PIPEDA.

Not well-founded
The investigation found no evidence, or insufficient evidence, to establish that the institution or organization contravened a provision of the Privacy Act or PIPEDA.

Resolved
An investigation under the Privacy Act either a) found that the complaint was due to a miscommunication or misunderstanding between the parties, or b) concluded with the institution taking the actions the OPC required to resolve the issue.

Settled
Over the course of the investigation, the OPC helped negotiate a solution satisfactory to both parties, and the investigation did not result in a finding.

Discontinued
Under the Privacy Act, the OPC terminates the investigation before every allegation has been inquired into. This can occur because the complainant loses interest in continuing the investigation or cannot be located to provide further information. Under PIPEDA, this disposition applies when an investigation is terminated by the Commissioner under subsection 12.2(1) without a finding being reported.

No jurisdiction
No report is issued because the institution, organization, or subject of the complaint was not subject to federal privacy legislation.

Early resolution (ER)
Prior to the OPC issuing a finding, the matter is satisfactorily resolved early in the process in the interest of the complainant.

Declined to investigate
The Commissioner declined to open an investigation into a complaint under PIPEDA. The Commissioner may find that the complainant should first exhaust other available procedures, that the complaint could be investigated under other Canadian laws, or that the complaint was filed outside the time limit set out in subsection 12(1).

Withdrawn
No report is issued under PIPEDA as the complainant “voluntarily withdrew the complaint” or was unable to be reached by the OPC.

History
The University of Waterloo vending machine facial recognition system error controversy began on February 10, 2024. The main screen of an M&M’s-branded vending machine, owned by Invenda Group and Mars Inc., displayed the error code “Invenda.Vending.FacialRecognition.App.exe — Application Error.” Reddit user SquidKid47 shared a photo of the error message to the subreddit r/uwaterloo, captioned: “hey so why do the stupid m&m machines have facial recognition?”

The post received about 1,100 upvotes and over 200 comments. Many users and students voiced concern that the FRT had not been disclosed and that the public was not being notified while on the University of Waterloo campus. Fourth-year student River Stanley investigated the machines in volume 154, issue 3 of the student publication mathNEWS, contacting Invenda Group and Adaria Vending Services about the issue. Global and local news organizations also reported on the controversy. The media attention and concern from campus-goers resulted in a demand for all 29 machines to be removed from campus. As of March 2024, University of Waterloo media relations representative Rebecca Elming confirmed that the machines had been unplugged and that a request for their removal had been sent to Invenda Group.

Invenda Group
Invenda Group is a Swiss company based in Alpnach that produces “smart” vending machines. These “intelligent” machines operate within the “Invenda ecosystem”, which allows owners to update the system and the advertisements on the selection screen remotely. Invenda Group customers include Mars Inc., Selecta, and Valora; its technology partners include Microsoft, Nayax, and Burroughs, Inc.

In partnership with Mars Inc. and Adaria Vending Services, Invenda provided “smart” vending machines equipped with facial image software on the University of Waterloo campus. Reddit users noted a visible hole for the camera on the front of the machine beside the selection screen.

Adaria Vending Services
Adaria is a Canadian company that specializes in retail and vending technology. Adaria customers include Amazon, Air Canada, and Conestoga College. Adaria manages fulfilment services for the Invenda machines at the university, including product restocks and logistics.

A statement from Adaria to Stanley said that the “smart” vending machines did not violate GDPR regulations. Invenda Group, whose machines are deployed in the EU, is adamant that it follows EU legislation on biometric data collection and FRT guidelines. According to this source, the vending machines do not use face recognition software, as there are no identification or verification applications, only detection software. The camera acts as a motion sensor: it scans for a customer and activates the display of options on the screen for user selection.

OPC Complaints
Two complaints about the vending machines on the university campus have been made to the OPC. The OPC is reviewing the issue, which spokesperson Vito Pilieci called “concerning”. A full investigation into the matter has not been confirmed.