Behavioral analytics and social media analytics technology is outpacing the policies and politics of behavioral profiling, hoodies included. This is the long-standing issue of IT's role in administering acceptable use, preventing crime, and using technology to enforce both. The issue began with IT departments being called upon to discern what counts as porn in their role of enforcing acceptable use policies (AUPs), but with the emergence of social media analytics and network-based behavioral analytics tools it has evolved into detecting and alerting on suspicious activity.
Video Analytics and Hoodies
The whole reason for writing this post today is the College Planning and Management story "Keeping Watch" in their March edition. The article struck a chord with me: institutional policies and IT governance are likely not equipped to properly employ video analytics.
My gut feeling is that there needs to be independent oversight of the 'rules engines' behind behavioral definitions and pattern matching if schools are to avoid liability from profiling or other abuses.
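To make the concern concrete, here is a purely hypothetical sketch of what a behavioral-definition rule in such an engine might look like. Every name, attribute, and threshold below is invented for illustration, not taken from any real product:

```python
from dataclasses import dataclass

@dataclass
class AlertRule:
    """A hypothetical behavioral-definition rule for a video analytics engine."""
    name: str
    attributes: list    # appearance attributes the classifier matches on
    behavior: str       # detected behavior pattern
    dwell_seconds: int  # how long the pattern must persist before alerting
    approved_by: str = ""  # independent reviewer sign-off, if any

# A rule like this encodes subjective judgments: 'hooded' is an
# appearance attribute, not evidence of intent. Without independent
# review, that judgment goes straight into production unexamined.
rule = AlertRule(
    name="loitering-near-dorm",
    attributes=["hooded"],
    behavior="loitering",
    dwell_seconds=300,
)

def needs_review(r: AlertRule) -> bool:
    """Flag any appearance-based rule that lacks an independent sign-off."""
    return bool(r.attributes) and not r.approved_by

print(needs_review(rule))  # True: appearance-based and unreviewed
```

The point of the sketch is that the subjective choices live in ordinary configuration data, which is exactly why they deserve oversight outside the team that writes them.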
I also had a gut reaction to the article’s use of two images showing ‘bad men’ in hoodies. I know my reaction is due to the heightened awareness of the stigma surrounding hoodies, and I realize a few years ago they would have used men wearing ski masks or pantyhose.
But that illustrates my point about the arbitrary and changing nature of suspicion and of what 'bad' people look like. Will your campus safety officer decide to profile hoodie-wearing men who loiter too long outside a dorm? What about other sets of attributes and behaviors?
Who decides what to alert on, record and track? How does the administration achieve governance to detect or prevent abuses which might expose the institution to litigation and liabilities?
Behavioral Analytics and Social Media Analytics
The need for risk management, controls and governance is not limited to video analytics. Traditional network tools have already moved well beyond rudimentary port blocking, user-ID-based awareness, or website reputation ratings for AUP or HEOA P2P compliance.
Today's tools go beyond controlling access to sites like Facebook by further allowing control over applications and features within Facebook, as well as user actions and device types.
That’s not all. IT departments can now use a wide variety of social media analytics and network based behavioral analytics tools to gain even deeper understanding of user actions to detect various policy violations on campus and off. But should they?
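As a concrete sketch of that feature-level control, consider the shape of a policy table that distinguishes actions within a site. The sites, actions, and verdicts below are hypothetical, not any vendor's actual policy syntax:

```python
# Hypothetical feature-level access policy: modern tools can control
# not just the site, but specific actions within it.
POLICY = {
    ("facebook.com", "view"):   "allow",
    ("facebook.com", "post"):   "allow",
    ("facebook.com", "games"):  "block",  # a feature within the site
    ("facebook.com", "upload"): "alert",  # allowed, but logged for review
}

def decide(site: str, action: str) -> str:
    """Return the policy verdict for a (site, action) pair."""
    return POLICY.get((site, action), "allow")

print(decide("facebook.com", "games"))   # block
print(decide("facebook.com", "upload"))  # alert
```

Note that the "alert" verdict is where the governance questions begin: someone has to decide who receives those alerts and what happens next.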
Who decides what sites are OK? Who decides what’s art, porn or clinical? Who audits those decisions, receives the exceptions reports and investigates alerted behaviors? Who can ask for a person specific history when it is an employee or a student?
What is IT's Role?
CIOs and IT managers are no more equipped for this role than any other manager, perhaps less so than HR. For some reason, HR departments are reluctant to take ownership of acceptable use as part of administering ethics or code-of-conduct policy because of the technology.
I should say they don't want to enforce the AUP until violations reach the point of action, even though most AUPs designate enforcement to the CIO or CISO.
Let me see if I can explain this with some illustrations. HR departments readily accept ownership of the sexual harassment policy, including training and enforcement. But they stop short of using technology to proactively look for evidence of a hostile work environment in institutional computer systems and networks, even though this is entirely doable. In contrast, Legal may actively look for data leakage or theft by employees.
Somewhere in the enforcement of ethics policies or code of conduct policies, for employees and students, there is a line. That line exists where the technology allows for proactive prevention and detection of policy violations. Oh and that line is constantly moving and very subjective.
The reluctance of HR departments and non-IT managers to cross the line exists because today's technology is so powerful that it surprises HR departments just how many violations there are and how many 'good people' are involved.
The technology is so powerful it often forces an administration to question their zero-tolerance policies when it conflicts with their instincts or heart.
The reluctance is also a reaction to the idea of becoming Big Brother and the enforcer. HR knows instinctively it just doesn't feel right, so the burden often gets pushed to IT to do the dirty work and take the heat.
Segregation of Duties
This could be a controversial and complex issue, made worse if left solely to the IT department or campus safety office. So it might be wise to consult with your external auditors or even legal counsel before becoming exposed.
Like so many other areas of risk, behavioral analytics warrants inclusion in your enterprise risk management (ERM) program. That includes establishing sufficient controls (administrative and IT) over all configurable tools that perform filtering, blocking, and pattern matching in the area of behavioral analytics.
Additionally, I would encourage following standard control strategies, beginning with appropriate segregation of duties (SOD) to ensure sufficient independence among the policy-decision, configuration, operation, and oversight functions.
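A minimal sketch of what that SOD check might look like in practice, with invented role names and an invented conflict matrix (your institution's actual roles will differ): no one person should both define behavioral policy and configure the rules engine, or both operate the tools and audit them.

```python
# Hypothetical conflict matrix: role pairs that must not be held
# by the same person if the functions are to stay independent.
CONFLICTS = [
    {"policy-author", "rules-configurator"},
    {"rules-operator", "auditor"},
]

def sod_violations(assignments: dict) -> list:
    """Return (person, conflicting-roles) pairs that break segregation of duties."""
    out = []
    for person, roles in assignments.items():
        for pair in CONFLICTS:
            if pair <= set(roles):  # person holds both conflicting roles
                out.append((person, sorted(pair)))
    return out

staff = {
    "alice": ["policy-author", "rules-configurator"],  # conflict
    "bob":   ["rules-operator"],
    "carol": ["auditor"],
}
print(sod_violations(staff))
```

The check itself is trivial; the hard part, as with the rest of this post, is deciding who gets to define the conflict matrix and who reviews the exceptions.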
Technology advancements like video analytics are certainly powerful. That only means we need to exercise additional, but reasonable, care so the good outweighs the bad.