The open source community, including Hortonworks, has invested heavily in building enterprise-grade security for Apache Hadoop. These efforts include Apache Knox for perimeter security, Kerberos for strong authentication, and the recently announced Apache Argus incubator, which brings a central administration framework for authorization and auditing.
In multi-platform environments with data arriving from many different sources, personally identifiable information, credit card numbers, and intellectual property can all land in the Hadoop cluster. The question becomes: how do you keep this sensitive data secure as it moves into Hadoop, while it is stored there, and as it moves beyond Hadoop?
Join Hortonworks and Voltage Security to learn about comprehensive security in Apache Hadoop, including:
Apache Argus: a central policy administration framework covering security requirements for authentication, authorization, auditing, and data protection;
Data-centric protection technologies that integrate easily with Hive, Sqoop, MapReduce, and other interfaces;
How to mitigate the risks of cyber-attacks and leaks of sensitive customer data; and
Ways to maintain the value of data for analytics, even in its protected form.
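To make that last point concrete, here is a minimal Python sketch of deterministic, format-preserving tokenization, the property that lets protected data keep its analytic value. This is a toy stand-in under stated assumptions, not Voltage's product or a standard format-preserving cipher such as NIST FF1: the key, function name, and digit-by-digit mapping are illustrative only, and the transform is one-way rather than decryptable.

```python
import hmac
import hashlib

SECRET_KEY = b"demo-key"  # placeholder only; real deployments use managed keys


def protect_digits(value: str, key: bytes = SECRET_KEY) -> str:
    """Deterministically replace each digit with a keyed pseudorandom digit,
    preserving length and separators so downstream schemas still validate.

    Toy illustration of format-preserving protection; unlike a real FPE
    cipher (e.g. NIST FF1) it is not reversible.
    """
    out = []
    for i, ch in enumerate(value):
        if ch.isdigit():
            # Keyed, position-dependent digest keeps the mapping deterministic:
            # equal inputs always protect to equal outputs.
            digest = hmac.new(key, f"{i}:{value}".encode(), hashlib.sha256).digest()
            out.append(str(digest[0] % 10))
        else:
            out.append(ch)  # keep separators like '-' so the format stays intact
    return "".join(out)


# Equal plaintexts yield equal tokens, so counts, joins, and GROUP BY
# queries over the protected column behave as they would over the original.
ccn = "4111-1111-1111-1111"
assert protect_digits(ccn) == protect_digits(ccn)
print(protect_digits(ccn))
```

Because the mapping is deterministic and format-preserving, analytics that rely on equality, such as joins and distinct counts, keep working on the protected values without ever exposing the originals.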
If you are enabling a Modern Data Architecture with Hadoop, protecting sensitive data is one area of security where enterprises need a cross-platform solution. We invite you to learn more about our joint approach at our webinar on Wednesday, August 27. Register here: http://hortonworks.com/blog/securing-hadoop-options/