Adam Vincent

Adam is a member of the Layer 7 Technologies Advisory Board – Public Sector and is considered a trusted subject matter expert by the Department of Defense (DoD) and Intelligence Community (IC) in their goal of secure net-centric enablement, providing guidance on the government's goal of sharing information more seamlessly.

September 15th, 2010

Hacking as a Service (HaaS)

On Monday this week there was a very interesting post by Andy Greenberg, a blog writer for Forbes.com, which introduces a botnet herd standing by for payment and targeting instructions to launch a powerful Distributed Denial of Service (DDoS) attack. Based on his research, the botherd, called "I'm DDOS" and available at "IMDDOS.org", is supposedly meant for testing purposes; however, it is not clear how the company running the service would or could validate that a target actually belongs to the attacker. You can see from the User Interface (UI) that the service looks fairly easy to use, making it a likely attack tool for anyone with minimal computer skills and a grudge.

As with pioneers in computer infrastructure as a service, such as Salesforce and Amazon's EC2 cloud, cyber arms dealers have begun asking customers, "Why buy when you can rent?" Renting cyber attack capabilities allows a political activist, terrorist group, or nation state to launch an attack on an online application - on demand. Those familiar with Cloud Computing and Software as a Service should recognize this as the malicious equivalent - "hacking as a service".
It is clear that the "?? as a Service" model is going to be popular with people who want to bring their products to market quickly and with those who want to see results with minimal up-front capital costs.




July 7th, 2010

Letter to the President on Cyber Security

The United States Senate sent President Obama a letter on July 1st. The letter spoke of the criticality of securing our nation's information systems, communications networks, and critical infrastructure, and stated that there is an urgent need for action to address the vulnerabilities. The action called for in the text largely consists of policy and coordination; however, the letter does state the need to improve and expand the U.S. cyber workforce and increase cyber threat awareness throughout the country.

This letter is a prelude to a number of highly sensitive pieces of legislation that the President will need to comment on in the coming weeks and months, and likely means that the Senate thinks there may be opposition to its legislation.

Here is the link:
http://www.nationaljournal.com/congressdaily/issues/documents/Letter_President_on_Cyber_Security_Legislation_070110.pdf





June 11th, 2010

Federated Service Monitoring


What is Federated Monitoring?

A wise man once told me that there is a big difference between reachability and availability. Ever since, I have been fascinated by the challenges we face with net-centric information sharing and service dependencies that cross all forms of organizational, network, and even classification boundaries. The reality is that with net-centric approaches and the need to re-use services, we will have massive dependencies on services outside of our control.
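
To make that distinction concrete, here is a minimal sketch in Python, using a hypothetical service host and health-check URL: reachability asks whether the host can be reached over the network at all, while availability asks whether the service behind it is actually answering correctly.

```python
import socket
import urllib.request

# Hypothetical host and health-check URL; substitute a real service of your own.
HOST = "services.example.mil"
PORT = 443
HEALTH_URL = "https://services.example.mil/mission-app/health"

def is_reachable(host, port, timeout=3):
    """Reachability: can we open a TCP connection to the host and port at all?"""
    try:
        sock = socket.create_connection((host, port), timeout)
        sock.close()
        return True
    except OSError:
        return False

def is_available(url, timeout=5):
    """Availability: does the service itself answer a health request successfully?"""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status == 200
    except Exception:
        return False

# A service can be reachable (the network path and host are up) and still be
# unavailable (the application behind the port is down or misbehaving).
print("reachable:", is_reachable(HOST, PORT))
print("available:", is_available(HEALTH_URL))
```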

The Federal Government has emphasized, and in some cases mandated, the use of XML, Web Services, and SOA concepts and standards to align IT assets with business processes and realize the concept of netcentricity. Simply put, netcentricity means making the right information available to the right people at the right time.

By exposing applications as reusable and dynamically composable services, new business processes can be defined on demand to allow for business agility. This is especially important as Government organizations are constantly defining and building solutions for an evolving set of requirements, many of which are based on near-term objectives to offer a set of capabilities to the war-fighter or analyst supporting an immediate threat.

The reality here is that these services will be stood up and offered throughout the government enterprise and will cross organizational, network, and potentially even classification boundaries. These newly formed IT Communities of Interest (CoI) will require a shared knowledge of their individual and collective purpose, mission objectives, service level agreements, security postures, and availability and reachability characteristics.

Existing monitoring approaches and products are based on the perspective of internal monitoring, portraying network, application, and service visibility from inside the organization. Within the DoD and IC the definition of enterprise is often not clear, and visibility and monitoring are segmented by project, department, organization, branch of service, and so on. In Government, we are integrating our services across these different mini-enterprises, yet we lack the ability to monitor services in a federated fashion. Since netcentricity is all about services, I assert that we don't care that much about the health and availability of a server or an application unless it impacts the service we are using, and therefore my focus is on Federated Service Monitoring.

Federated service monitoring portrays service availability information as it relates to usage of the service external to the enterprise. Availability in this case is measured not only by the internal service's status, but also by aspects of the service provider's network. This end-to-end reachability information must be portrayed outside the enterprise in a secure fashion and made available to those wishing to use the service. With federated monitoring, service implementers can extend their internal monitoring outside the organization, allowing business partners to accurately measure service availability, reachability, and performance on an ongoing basis.

The Department of Defense (DoD) and Intelligence Community (IC) have developed the Joint DoD/IC Enterprise Service Monitoring (JESM) Specification, which in time will be used across the government as a way of doing secure federated monitoring. The JESM specification is based on a subset of WSDM relevant to DoD/IC use-cases and on WS-Eventing.

Layer 7 Technologies (www.layer7tech.com) SecureSpan and CloudSpan product lines fully support the Joint DoD/IC ESM specification. For every service within Layer 7, JESM monitoring can be enabled for external consumption of service metrics. The JESM Service supports request/response or publish/subscribe, and for each JESM-enabled service (Mission App A-C, etc.) policy can be enforced to ensure access control, confidentiality, integrity, and audit of JESM data. For example, Mission Application A metrics can be made available, but access limited by the attributes of the authenticated subject who is requesting them.
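
To illustrate that last point, here is a minimal sketch of attribute-based access control over externally published service metrics. The service names, subject attributes, and policy values below are hypothetical illustrations of the pattern, not the JESM specification's actual schema or any Layer 7 product API.

```python
# A minimal sketch of attribute-based access control over externally published
# service metrics. Service names, subject attributes, and policy values are
# hypothetical; they are not the JESM specification's schema or a Layer 7 API.

SERVICE_METRICS = {
    "MissionAppA": {"availability": 0.999, "avg_response_ms": 120},
    "MissionAppB": {"availability": 0.971, "avg_response_ms": 340},
}

# Policy: attributes an authenticated subject must present to read each service's metrics.
METRICS_POLICY = {
    "MissionAppA": {"community": "coalition-x", "clearance": "secret"},
    "MissionAppB": {"community": "internal-only", "clearance": "secret"},
}

def get_metrics(service_name, subject_attributes):
    """Return a service's metrics only if the subject's attributes satisfy its policy."""
    required = METRICS_POLICY.get(service_name, {})
    for attribute, value in required.items():
        if subject_attributes.get(attribute) != value:
            raise PermissionError("subject not authorized for %s metrics" % service_name)
    return SERVICE_METRICS[service_name]

# A coalition partner can read Mission App A's metrics, but a request for
# Mission App B's metrics with the same attributes would be denied.
partner = {"community": "coalition-x", "clearance": "secret"}
print(get_metrics("MissionAppA", partner))
```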

In my time working with government, I have seen numerous occasions where a service went down and no one knew for several days, all while its consumers believed the data coming from the service was still available.

Mission IT visibility (past, current, and future) and operational flexibility (in the face of attack or even power failure) are critical. Federated monitoring isn't a silver bullet, but I believe it will help communities of interest come together quickly and integrate their IT, while providing visibility and the ability to react in the case of failure.


March 3rd, 2010

CNCI Partially Unclassified


In the wake of CNN's airing of the two-hour special, We Were Warned: Cyber Shockwave, which stunned much of the public and the current administration, Mr. Howard Schmidt, the Executive Branch Cybersecurity Coordinator, or Cyber Czar, gave a keynote speech at the RSA conference yesterday.

Although I was stuck in DC, I read online that he spoke of Partnership and Transparency as critical components of the President's Cyberspace Policy Review. Mr. Schmidt also announced that the Obama Administration had revised the classification guidance for the Comprehensive National Cybersecurity Initiative (CNCI) and that the unclassified portion would be made available by the end of day on Tuesday on the whitehouse.gov website. CNCI was launched by President Bush in National Security Presidential Directive 54/Homeland Security Presidential Directive 23 (NSPD-54/HSPD-23) in January 2008. Although I'm excited about the potential of Partnership and Transparency, as I too believe we cannot be successful in cyberspace without them, I was more interested in the public release of CNCI, which illustrates some direction for our government in the shorter term.

The announcement of a funded strategy, even if it's not the Administration's promised National U.S. Cybersecurity strategy, couldn't have come at a better time. With the recent CNN broadcast, and the successful attacks in the last few months on Google and Twitter, the consensus is that the US is not ready for a sophisticated cyber attack that crosses Government, critical infrastructure, and private domains.

If you watched the CNN special, you would have seen that the question wasn't whether we could defend ourselves from a nation state or hacker group, or launch an attack across cyberspace, but rather whether we could react quickly enough to an attack while being impeded by questions of law, policy, politics, and jurisdictional boundary - issues not shared by our fast-moving, highly sophisticated adversaries.

CNCI has the following major goals:

  • To establish a front line of defense against today's immediate threats
  • To defend against the full spectrum of threats
  • To strengthen the future cybersecurity environment

Layer 7 Technologies, a vendor of dynamic cyber defense products, provides its customers with the ability to protect applications, enable application monitoring for situational awareness, and adapt in the face of attack. These capabilities are critical in providing solutions across all three of the major goals of CNCI.

For more information on CNCI, please see www.whitehouse.gov

For more information on Layer 7 Technologies, please see www.layer7tech.com

February 11th, 2010

Identity and Access Management in Cloud Computing: Part 2


Cloud Computing Implementation Options and Challenges

Like any traditional IT project, a project leveraging cloud computing must first look to its requirements. Most IT projects have some requirement for identity, whether all access to the cloud or just administrative access requires Authentication and Authorization. This second blog post in the series titled "Identity and Access Management in Cloud Computing" focuses on the implementation challenges of Identity and Access Control Architectures as they relate to cloud computing.

Identities for cloud computing can be broken down into the following categories:

  • Enterprise - Enterprise Users, and applications that will access cloud applications
  • Internet - Customers, Partners, and Unanticipated Users that will access cloud applications
  • Cloud - Cloud applications that will access cloud, enterprise, and partner applications

Whether we are talking about cloud usage, or cloud administration, identities can be binned into one of these three categories. The following paragraphs focus on the options and challenges in implementing an identity and access control architecture for cloud computing.

Identity Management - Identities may be associated with human resources hiring and firing, new or changing partner and contractor relationships, or new servers or applications being setup. Processes may include identity creation and role/group addition, credential issuance, audit and compliance, and on-going management and eventual deletion. Most companies leverage products which govern the creation of identities within their enterprise in accordance with their particular compliance regulations.

There are two approaches to identity management in cloud computing:

  • Leverage existing enterprise identity management system for cloud identities
  • Utilize a new cloud based identity management system and process for cloud identities

Identity management in the cloud, whether through an integral cloud-provided identity system or a cloud-deployed identity management system, falls short in a number of ways. Below are the issues that come to mind:

User Experience

  • Separate systems increase user frustration

  • Users having more than a single credential can be problematic

  • Users have to deal with two separate processes for identity creation

  • Users may potentially become confused with enterprise vs. cloud issues and or policies

Manageability

  • Maintaining identities in two places doubles the administrative effort

  • User attributes are not automatically populated in cloud-based systems

Compliance and Risk

  • Cloud-based systems must adhere to regulatory requirements for identity provisioning

  • Cloud-based systems can easily be overlooked when changes are made to enterprise users' identities and privileges

  • Cloud-based systems may be susceptible to internet breach

Cost

  • Double the amount of work required to administer users
  • Purchasing and fielding identity products to the cloud may be costly
  • Separate audit and compliance may require significant investments

Therefore, we must look to our existing enterprise identity management capabilities for managing identities for cloud usage, and administration.

Authentication Services - Principals are authenticated based on the principal making a claim regarding its identity, and then providing proof that the claim is true. For example, in computer systems, the username claims the principal's identity, while the password, a shared secret between the user and the system with which they are authenticating, is the proof.

Authentication Services are responsible for authenticating principals based on the principal making a claim regarding its identity, and validating that the claim is true. An Authentication Service provides a single logical component of an IT architecture where authentication may be accomplished. LDAP is a typical Authentication Service in that it provides a single point where users can be validated against their claims, whether the claim is in the form of a password, a certificate, or a stronger form of credential.
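
As a concrete illustration of the claim-and-proof pattern against an LDAP directory, here is a minimal sketch using the Python ldap3 library; the directory host and user DN are hypothetical placeholders. The DN is the identity claim, and a successful simple bind with the password is the proof.

```python
from ldap3 import Server, Connection, ALL

# Hypothetical directory host and user DN; substitute your own values.
LDAP_HOST = "ldap.example.com"
USER_DN = "uid=jdoe,ou=people,dc=example,dc=com"

def authenticate(user_dn, password):
    """Claim: the DN identifies the principal. Proof: a successful simple bind."""
    server = Server(LDAP_HOST, get_info=ALL)
    conn = Connection(server, user=user_dn, password=password)
    try:
        return conn.bind()  # True only if the directory accepts the credentials
    finally:
        conn.unbind()

# Example usage (would contact the hypothetical directory):
# print(authenticate(USER_DN, "correct horse battery staple"))
```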

Identities and claims are managed and stored within the enterprise today and investments have already been made in this area. Authentication in the cloud requires user identities and claims to be available to the cloud applications. There are four approaches to this that will be discussed:

New cloud based solution
  • Many of the same issues encountered in moving identity management to the cloud are encountered with this approach.
  • Possible breach and release of identities to the internet
  • Administrative burden in managing two systems
Connectivity to the enterprise
  • For security reasons, LDAP and other enterprise identity repositories are not accessible from the internet and thus would not be available to cloud applications.
  • If they were available, latency of authentication queries may be a significant issue.
Identity replication from enterprise to cloud
  • Storing all enterprise users' information in the cloud poses a security and privacy problem should the cloud-based identity repository be breached from the internet.
Federation of enterprise identity system
  • This approach carries the most opportunity for success, as identity repositories can remain within the protected interior of the enterprise. An externally available Secure Token Service (STS) could allow authentication and issuance of a federated authentication token to be used for authenticating to the cloud.

Federation of enterprise identity systems will be described in a future blog posting. This is the basis for allowing Identity Management Systems and Authentication Services to remain within the enterprise.
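
For a flavor of what that federation looks like in practice, here is a minimal sketch assuming a hypothetical enterprise STS endpoint and a hypothetical cloud application URL. In a real deployment the exchange would typically be a WS-Trust RequestSecurityToken issuing a signed SAML assertion; the function names and token format here are illustrative assumptions only. The key point is that the enterprise directory and password never leave the enterprise; the cloud application only validates the signed token.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoints; in a real deployment the STS exchange would typically
# be a WS-Trust RequestSecurityToken call issuing a signed SAML assertion.
STS_URL = "https://sts.enterprise.example.com/token"
CLOUD_APP_URL = "https://app.cloud.example.com/api/orders"

def get_federated_token(username, password):
    """Authenticate once against the enterprise STS and receive a signed token."""
    form = urllib.parse.urlencode({"username": username, "password": password}).encode()
    with urllib.request.urlopen(STS_URL, data=form) as resp:
        return json.loads(resp.read())["token"]

def call_cloud_app(token):
    """Present the federated token to the cloud application, which validates the
    STS signature and never sees the enterprise password or directory."""
    request = urllib.request.Request(
        CLOUD_APP_URL, headers={"Authorization": "Bearer " + token})
    with urllib.request.urlopen(request) as resp:
        return resp.read()
```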

Authorization Services - Authorization is the means of ensuring that only properly authorized principals are able to access resources within a system. A principal can be a human, a machine, or an application. To carry out authorization, the first step is to authenticate the principal, the second step is to obtain information about the principal and the resource with which the principal is interacting, and the final step is to allow or deny access to the principal based on the applicable policies for that resource.
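
As a minimal sketch of those three steps, using hypothetical attribute and policy names (a real deployment would typically use an Attribute Service and a policy language such as XACML):

```python
# Hypothetical attribute store and resource policies; a real deployment would
# typically use an Attribute Service and a policy language such as XACML.

ATTRIBUTE_STORE = {
    "jdoe": {"role": "analyst", "organization": "agency-a"},
}

RESOURCE_POLICIES = {
    # Each policy decides on the attributes gathered about the principal.
    "/reports/threat-summary": lambda attrs: attrs.get("role") == "analyst",
}

def authorize(principal, resource):
    """Step 1 (assumed already done): the principal has been authenticated.
    Step 2: collect attributes about the principal.
    Step 3: evaluate the resource's policy to allow or deny access."""
    attrs = ATTRIBUTE_STORE.get(principal, {})
    policy = RESOURCE_POLICIES.get(resource)
    if policy is None:
        return False  # no policy defined for the resource: deny by default
    return policy(attrs)

print(authorize("jdoe", "/reports/threat-summary"))  # True
print(authorize("jdoe", "/admin/configuration"))     # False
```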

An Authorization Service is responsible for evaluating an authorization query, collecting necessary information about the principal and the resource, potentially from an Attribute Service and/or identity directory, and evaluating a policy to determine if access should be granted or denied. There are three approaches where an authorization policy may be enforced in cloud computing.

Enterprise Authorization - The Cloud application asks the enterprise to make an authorization decision to grant or deny access.

  • Policies are created, managed, and stored within the enterprise
  • Authorization Services must be available to the internet, which raises potential security issues such as man-in-the-middle and denial-of-service attacks impacting cloud application usage
  • Latency may be an issue as cloud resources depend on network calls to enterprise for access

Stand Alone Cloud Authorization - Usage of cloud provided or custom authorization services to grant or deny access

  • Policies are created, managed, and stored in the cloud
  • Requires separate administration of cloud-based system
  • Coarse-grained capabilities of cloud-provided solutions may not suffice
  • Compliance and regulatory requirements may not be met by cloud provided systems

Cloud Authorization with Enterprise Governance - The cloud makes an authorization decision but policies are governed by the enterprise

  • Policies are created, managed, and stored in the enterprise but cached in the cloud
  • Allows policies to be created and managed in accordance with enterprise processes
  • Allows faster response times as authorization services are available local to the cloud applications

For these reasons, the most robust mechanism for cloud authorization is to deploy an authorization service in the cloud which can retrieve authorization policies from the enterprise. This will be the topic of a future blog posting. Specifically, standards will be discussed which make it possible for cloud-based authorization services to retrieve policies from the enterprise in a secure fashion.
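
As a minimal sketch of that arrangement, assuming a hypothetical enterprise policy endpoint and a simple JSON policy format: the cloud-side authorization service periodically pulls policies from the enterprise and answers decisions locally from its cache, so decisions stay fast while governance remains with the enterprise. In practice the retrieval would be mutually authenticated and the policies expressed in a standard such as XACML.

```python
import json
import time
import urllib.request

# Hypothetical enterprise policy endpoint returning JSON of the form
# {"/api/orders": ["analyst", "manager"], ...}; the URL and format are assumptions.
ENTERPRISE_POLICY_URL = "https://policy.enterprise.example.com/cloud-policies"
CACHE_TTL_SECONDS = 300

_cache = {"policies": {}, "fetched_at": 0.0}

def _refresh_policies():
    """Pull the current policy set from the enterprise and cache it in the cloud."""
    with urllib.request.urlopen(ENTERPRISE_POLICY_URL) as resp:
        _cache["policies"] = json.loads(resp.read())
    _cache["fetched_at"] = time.time()

def is_permitted(role, resource):
    """Decide locally in the cloud, refreshing the enterprise-governed cache when stale."""
    if time.time() - _cache["fetched_at"] > CACHE_TTL_SECONDS:
        _refresh_policies()
    return role in _cache["policies"].get(resource, [])
```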

Conclusions

Organizations must extend their existing Identity and Access Management strategies into the cloud. New solutions built solely for the cloud simply will not scale; rather, the cloud must be seen as part of the "extended" enterprise, where existing privacy concerns, compliance issues, and processes and controls are dealt with in the cloud using strategies and solutions already built and utilized within the enterprise. In future blog postings, I plan to discuss ways that the enterprise can extend its existing solutions for Authentication and Authorization Services to the cloud.