In multi-platform environments that ingest data from many different sources, personally identifiable information, credit card numbers, and intellectual property can all land in the Hadoop cluster. The question becomes: how do you keep this sensitive data secure as it moves into Hadoop, while it is stored, and as it moves beyond Hadoop?
Join Hortonworks and Voltage Security to learn about comprehensive security in Apache Hadoop, including:
Apache Argus: a central policy administration framework covering authentication, authorization, auditing, and data protection;
Data-centric protection technologies that easily integrate with Hive, Sqoop, MapReduce and other interfaces;
How to reduce the risk of cyberattacks and leaks of sensitive customer data; and
Ways to maintain the value of data for analytics, even in its protected form.
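The last point, keeping protected data useful for analytics, is commonly achieved with deterministic, format-preserving protection: the same plaintext always maps to the same token, so joins, group-bys, and partial display still work on the protected values. The sketch below is a simplified, hypothetical illustration of that idea only; the key, function name, and HMAC-based masking are assumptions for this example, not Voltage's actual product API.

```python
import hmac
import hashlib

# Illustration only: real systems use centrally managed keys, not constants.
SECRET_KEY = b"demo-key"

def tokenize_pan(pan: str) -> str:
    """Deterministically replace the middle digits of a card number,
    preserving its length and format plus the first six and last four
    digits, so the protected value still supports joins and grouping."""
    digits = pan.replace("-", "").replace(" ", "")
    head, mid, tail = digits[:6], digits[6:-4], digits[-4:]
    # Derive a pseudorandom digit string the same length as `mid`.
    mac = hmac.new(SECRET_KEY, digits.encode(), hashlib.sha256).hexdigest()
    masked_mid = "".join(str(int(c, 16) % 10) for c in mac[:len(mid)])
    return head + masked_mid + tail
```

Because the mapping is deterministic, two records holding the same card number tokenize to the same value, which is what lets analytics run on the protected form without ever exposing the original.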
If you are enabling the Modern Data Architecture with Hadoop, protecting sensitive data is an area of security where enterprises need a cross-platform solution. We invite you to learn more about our joint approach at our webinar on Wednesday, August 27. Register here: http://hortonworks.com/blog/securing-hadoop-options/