Privacy Daily is a service of Warren Communications News.

Mass. Senate Panel Clears Bill on Employer Surveillance, Automated Decisions

A Massachusetts bill regulating employers’ use of electronic monitoring and automated decision-making advanced in the state Senate this week. The Internet Committee on Thursday approved S-35 and sent it to the Ways and Means Committee. The panel also cleared a social media accountability bill (see 2510160046).


The Internet Committee heard testimony last month on S-35, also known as the Fostering Artificial Intelligence Responsibility (FAIR) Act, which was cross-filed with H-77 in the House. Labor groups supported the bill, while tech industry associations raised concerns during the hearing (see 2509110084).

The legislation would regulate employers’ use of electronic monitoring tools and the data they collect. “An employer shall not rely primarily on employee data collected through electronic monitoring when making hiring, promotion, disciplinary decisions up to and including termination, or compensation decisions,” says the bill. Also, the bill would require disclosures when “an employer makes a hiring, promotion, termination, disciplinary or compensation decision based in whole or part on data gathered through the use of electronic monitoring.”

In addition, employers would have to conduct impact assessments before using “electronic monitoring, alone or in conjunction with an automated employment decision system.” The bill adds that “it shall be unlawful for an employer to use an automated employment decision tool for an employment decision, alone or in conjunction with electronic monitoring, unless such tool has been the subject of an impact assessment.”

The FAIR Act includes a private right of action and would require the attorney general to promulgate regulations implementing the proposed law.

“Such regulations shall consider … bias testing, appropriate disclosures, clear, conspicuous, and reasonably understandable notice, whether there exists a client-professional relationship, best and current practices and models utilized by other states and the federal government to ensure regulations are responsive to emerging technologies, and appropriate additional documentation that is reasonably necessary to assist the Office to evaluate the inputs and outputs and monitor the performance of artificial intelligence and automated decision-making systems for the risk of bias and consumer harm.”