
Agency: Cordis | Branch: H2020 | Program: CSA | Phase: ICT-35-2014 | Award Amount: 499.88K | Year: 2014

The PICSE Procurers Platform will give access to a unique repository of information supporting the move from outright purchase to pay-per-usage made possible by the arrival of cloud computing. It builds on the Helix Nebula collaboration between supply and demand, of which the three PICSE partners are key members, including the Helix Nebula coordinator, CERN. It addresses the fragmented landscape of inconsistent technical approaches and disjointed managerial structures that prevents delivery of a production-quality cloud computing e-infrastructure. PICSE will engage with cloud service providers, their customers and procurement professionals over a crucial period as Europe's Cloud Strategy comes to fruition and several large multinational procurements (including PPIs and PCPs) take place. The project will provide a focal point, avoiding duplication of effort, to identify, analyse, publicise and harmonise opportunities for shared procurement, including a direct response to the ECP Trusted Cloud Europe science use case addressing cross-border procurement. PICSE will resolve key financial and legal constraints impacting business development and procurement, and provide a range of best practices that address those barriers from both the private and public sectors, including the research domain, in and beyond Europe. It will set out a realistic roadmap for future procurement based on the levels of ambition for adoption of cloud services over the next five years. This reflects the European Cloud Computing Strategy, which calls for a framework of standards to give procurers confidence that they have met their compliance obligations and that they are getting an appropriate solution to meet their needs.

Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2011.1.4 | Award Amount: 4.22M | Year: 2012

Cloud technology offers a powerful approach to the provision of infrastructure, platform and software services without incurring the considerable costs of owning, operating and maintaining the computational infrastructures required for this purpose.

Despite its appeal from a cost perspective, cloud technology still raises concerns regarding the security, privacy, governance and compliance of the data and software services offered through it. Such concerns arise from the difficulty of guaranteeing security properties of the different types of services available through clouds. Service providers are reluctant to take full responsibility for the security of their services once the services are uploaded and offered through a cloud. Cloud suppliers have also historically refrained from accepting liability for security leaks. This reluctance stems from the fact that the provision and security of a cloud service is sensitive to changes due to cloud operation, as well as to potential interference between the features and behavior of all the inter-dependent services in all layers of the cloud stack. Still, many cloud users, including institutional ones, would like the cloud-based services they use to exhibit certified security properties.

CUMULUS will address these limitations by developing an integrated framework of models, processes and tools supporting the certification of security properties of infrastructure (IaaS), platform (PaaS) and software application layer (SaaS) services in the cloud. The CUMULUS framework will bring service users, service providers and cloud suppliers together with certification authorities in order to ensure security certificate validity in the ever-changing cloud environment.

CUMULUS will rely on multiple types of evidence regarding security, including service testing and monitoring data and trusted computing proofs, based on models for hybrid, incremental and multi-layer security certification. Whenever possible, evidence gathering will build upon existing standards and practices (e.g., interaction protocols, representation schemes, etc.) regarding the provision of information for the assessment of security in clouds.

To ensure large-scale industrial applicability, the CUMULUS framework will be evaluated against cloud application scenarios in key industrial domains, namely Smart Cities and eHealth services and applications.

CUMULUS is aligned with the recommendations of a recent industrial consultation to the European Commission, which identified cloud certification as an enabling technology for building trust for end users through the deployment of standards and certification schemes relevant to cloud solutions, and included it among the ten key recommendations and actions for a cloud strategy in Europe.

Agency: Cordis | Branch: FP7 | Program: CSA | Phase: ICT-2013.1.2 | Award Amount: 907.12K | Year: 2013

CloudWatch will ensure high visibility of European R&D cloud initiatives driving interoperable solutions and services. Over 24 months, CloudWatch will accelerate and increase the use of cloud computing across the public and private sectors in Europe and strengthen collaborative, international dialogue on interoperability and portability. Three Concertation Meetings will support organisations, fostering multi-stakeholder dialogue and cross-fertilisation on best practices. CloudWatchHUB.eu will raise awareness of the benefits to major stakeholder groups: enterprises, especially SMEs; governments and public authorities; and research and education institutions. Drawing on key issues, it will disseminate best practices on model contract terms, foster a multi-stakeholder dialogue and facilitate the emergence and use of standard contracts. CloudWatch will make an active contribution to standards and certification, driving interoperability as critical to broadening choice and boosting innovation. It will provide a portfolio of EU and international use cases that demonstrate interoperability, portability and reversibility. The use cases will cover technical requirements as well as policy and legal requirements, such as SLA management, and will lead to the development of common standards profiles and testing around the federation of cloud services. CloudWatch will also support efforts around certification and compliance testing. Supporting and guiding SMEs on the relevance, maturity and timely implementation of standards is key, and will drive industry consensus; it will therefore contribute to increasing trust and addressing some of the biggest obstacles to uptake. CloudWatch is guided by Business Innovation and Global Interoperability Experts, supporting its strategic goals and helping to ensure the long-term sustainability of CloudWatchHUB.eu.

News Article | November 3, 2015
Site: thenextweb.com

The Snowden effect has caused the European Court of Justice (ECJ) to strike down Safe Harbor. This 15-year-old data transfer agreement between the EU and the U.S. allowed multinationals to store Europeans’ data in the U.S. if the companies agreed to comply with Europe’s data privacy laws. This turn of events certainly causes operational angst for thousands of U.S. businesses that store data overseas. Tightening data privacy regulations carry potentially dire consequences for businesses that can’t quickly adapt.

In particular, the Safe Harbor ruling puts Cloud Service Providers (CSPs) in a tough spot: many of them depend on the framework, or closely related approaches, to do business in Europe, as it acts as the mechanism authorizing them to store data on behalf of European companies. This ruling will have a large impact on some corporations’ investment focus and financial performance. For example, CSPs may need to build new data centers in countries in which data must now reside; in the meantime, the ruling will impact their ability to sell services to entire regions if their lack of a local presence precludes them from complying with data privacy regulations.

As organizations aggressively push cloud adoption, it’s a given that more sensitive and regulated data is ending up in the hands of outside service providers and solutions like SaaS application systems. Organizations need actionable advice for instituting proactive means and mechanisms to ensure data privacy and regulatory compliance while they run the business – a significant piece of guidance that is lacking from the Safe Harbor legislation. As a starting point, here are five tips for companies to control cloud data and access in light of the Safe Harbor ruling and evolving regulatory landscape.

Today, almost every CIO is dealing with significant growth in Shadow IT.
Shadow IT is hardware or software within an enterprise that was not procured through sanctioned approaches and is not supported by the organization’s central IT department. The phrase often carries a negative connotation because it correctly implies that the IT department has not approved the technology, or doesn’t even know that employees are using it. One of the biggest areas this impacts is cloud application adoption (apps like Box, Evernote, etc.). The reality is that if CIOs aren’t seeing much Shadow IT cloud use within their organization, it usually means that they are not looking for it or are looking in the wrong place.

The growth of Shadow IT cloud adoption is a result of current changes in workplace culture, technology and the nature of modern-day work. For example, employee demand for more applications to do their jobs is outstripping many IT teams’ capacity to meet the requirements. At the same time, business users feel that it has become too complex to source business applications through traditional IT processes. The digital transformation occurring in our society also plays a role in the rise of Shadow IT, as business users are drawn to new apps and many business stakeholders have a ‘let’s just do it’ mentality.

To ensure this form of enterprise cloud adoption is done in a controlled way, IT teams need to leverage the proper tools, such as a new category of solutions called the Cloud Access Security Broker (CASB), offered by a selection of technology providers, to gain visibility into what Shadow IT is being used. Gartner has written extensively on the importance of these CASB solutions for enterprises and has recently predicted that by year-end 2018, 50 percent of organizations with more than 2,500 employees will use a CASB product to control cloud SaaS usage.
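The visibility step described above can be illustrated with a minimal sketch. This is not a real CASB product: the log format, the domain names and the sanctioned-app list are all hypothetical, and a real deployment would analyze firewall or proxy exports at scale.

```python
# Minimal Shadow IT discovery sketch: flag cloud-app domains seen in outbound
# proxy logs that are not on the IT-sanctioned list. The log format below
# ("timestamp user domain") is an assumption for illustration only.
SANCTIONED = {"office365.com", "salesforce.com"}

LOG_LINES = [
    "2015-11-03T09:14 alice box.com",
    "2015-11-03T09:15 bob office365.com",
    "2015-11-03T09:16 carol evernote.com",
    "2015-11-03T09:17 alice box.com",
]

def shadow_it_report(lines):
    """Count requests per unsanctioned cloud domain."""
    counts = {}
    for line in lines:
        domain = line.split()[-1]  # last field is the destination domain
        if domain not in SANCTIONED:
            counts[domain] = counts.get(domain, 0) + 1
    return counts

print(shadow_it_report(LOG_LINES))  # {'box.com': 2, 'evernote.com': 1}
```

Even a crude report like this tends to surface far more unsanctioned cloud use than IT teams expect, which is the point of the "look for it" advice above.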
Once they have this insight, they can then put the proper tools in place that not only protect the business but also allow employees to continue working in a secure and compliant way in order to drive business results.

Tokenization is a process by which sensitive data fields, such as a patient’s medical information, are replaced with surrogate values called tokens. De-tokenization is the reverse process of redeeming a token for its associated original value. While various approaches for creating tokens exist, frequently they are simply randomly generated values that have no mathematical relation to the original data field. This underlies the security of the approach: it is nearly impossible to determine the original value of a sensitive data field by knowing only the surrogate token value. Since tokens have no mathematical relationship tying them back to the original clear-text sensitive data, there is no possibility of back doors or trap doors. Because tokens, unlike encrypted values, cannot be reversed back to their original values through the use of a cipher algorithm and a key, tokenization is a popular approach for meeting data residency requirements – regulations specifying that certain data types must remain within a defined geographic border. Data residency is a similar issue to the ECJ ruling on Safe Harbor, which is why many are looking to tokenization as a potential solution to the Safe Harbor issue.

For businesses operating in Europe, leveraging a CSP’s local EU datacenter makes a lot of sense; however, there are a few things to take into consideration. For example, enterprises need to determine where the data will reside – within the broader EU, within a specific country, or within a specific state/province/region of a specific country. Knowing this is critical when selecting the right data center location to store data.
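The tokenize/de-tokenize cycle described above can be sketched in a few lines. This is a toy illustration, not a production scheme: the vault here is an in-memory dict, whereas a real deployment would keep the vault in a hardened datastore under the enterprise's sole control (as the best-practice guidance later in this piece stresses).

```python
import secrets

# Toy token vault: maps random surrogate tokens back to original values.
# In production this lives in a secured datastore the enterprise owns.
_vault = {}

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random surrogate token.

    The token is randomly generated, so it has no mathematical relation
    to the original value: knowing the token reveals nothing.
    """
    token = secrets.token_hex(16)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    """Redeem a token for its original value (requires vault access)."""
    return _vault[token]

t = tokenize("patient-1234: diagnosis X")
assert t != "patient-1234: diagnosis X"            # surrogate leaks nothing
assert detokenize(t) == "patient-1234: diagnosis X"
```

Because only the vault links token to value, the sensitive field can stay in-region while the token travels to a cloud service anywhere, which is why tokenization maps so naturally onto data residency requirements.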
CSPs often maintain the contractual right to move data between datacenters, so it is important to understand whether your CSP reserves this right. Just because an enterprise’s primary data center is in the region or country it needs to be in does not mean the back-up one is; it could be in another country or region altogether. In addition, enterprises should have a clear understanding of whether or not other cloud apps will “dip into” their primary cloud’s datacenter. For example, a cloud that does complicated product pricing and quoting estimates may access data from another cloud service that manages customer information in order to perform its function.

Regardless of where the cloud provider’s datacenter sits, executives may also need to be concerned about the location and/or citizenship of the CSP’s employees that have access to the data. Some vertical business sectors, like defense-related manufacturing, frequently have additional restrictions placed on them about the citizenship of individuals that can access data. If the cloud provider has employees located in various countries around the world who have access to data for routine maintenance and data hygiene purposes, make sure to understand data requirements carefully to avoid compliance and audit flags down the road.

The regulatory and data privacy landscape will continue to change, so future-proofing IT and cloud infrastructure allows for the flexibility to quickly adapt to evolving regulations. For example, enterprises can take steps to share data in an anonymized fashion and still make it usable within the business. By parsing, anonymizing and encrypting data, insights can be gained and actions taken without violating individual privacy. The Safe Harbor ruling in October is a perfect example of how the shifting sands of changing regulations work.
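One way to make anonymized data "still usable within the business", as described above, is to replace identifiers with a keyed hash before sharing records: third parties can still join and aggregate on the pseudonym, but cannot recover the identity without the key. This is a sketch of one possible approach, not the article's specific method; the key and record shown are illustrative.

```python
import hashlib
import hmac

# Illustrative only: a real key would live in an HSM/KMS under the
# enterprise's sole control, never in source code.
SECRET_KEY = b"enterprise-held-key"

def pseudonymize(identifier: str) -> str:
    """Map an identifier to a stable pseudonym via HMAC-SHA256.

    Deterministic: the same input always yields the same pseudonym,
    so analytics and joins still work on the shared data.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"user": "alice@example.eu", "purchases": 7}
shared = {"user": pseudonymize(record["user"]), "purchases": record["purchases"]}

# The pseudonym is stable, but the original identity never leaves the enterprise.
assert shared["user"] == pseudonymize("alice@example.eu")
assert shared["user"] != "alice@example.eu"
```

Without the enterprise-held key, a recipient cannot feasibly reverse or even test guesses against the pseudonyms, which is what distinguishes keyed hashing from a plain unsalted hash of the identifier.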
Security and compliance professionals went to bed on October 5th knowing they were in compliance, and then found themselves with a compliance problem by the afternoon of October 6th. And even since then, the sands have shifted further. Many CSPs have pivoted to using Model Clause language as a means to show compliance with EU data privacy requirements, but many compliance and advisory firms, and now some Data Protection Authorities in Europe, have since weighed in saying that the same issues cited in the ECJ’s ruling as the reasons why Safe Harbor was deemed inadequate also plague the Model Clause approach. A quick search on the message boards of the International Association of Privacy Professionals (IAPP) reveals the extent of the issue and the depth of the challenge.

Instead of relying solely on mechanisms like the Model Clauses, savvy executives are finding smart ways to use technologies to ensure that data deemed sensitive or regulated stays within their IT environment whenever possible. If it needs to be shared with third parties – like CSPs – they can anonymize it via tokenization or encryption to make sure only trusted, authorized parties have the means to bring any sensitive or private information back into the clear.

Analyst firms like Gartner and associations like the Cloud Security Alliance have been clear in their best-practice guidelines on securing data. For example, if enterprises are going to encrypt information, they need to use strong, well-vetted algorithms and ensure that the enterprise retains sole physical ownership of the encryption keys. The same concept on ownership holds for tokenization: maintain sole ownership of the token vault that is used to unlock tokens and bring data back into the clear. In addition, enterprises need to recognize data heading to the cloud as having a three-phase lifecycle: data in-transit to the cloud, data at-rest being stored in the cloud, and finally data in-use being processed in the cloud.
As a result, if they are only using encryption solutions that protect data at-rest, their data is only protected for one third of its full lifecycle, and steps need to be taken to protect the other two-thirds. This is especially important because the other two phases, data in-transit and data in-use, are arguably the riskier portions of the lifecycle: many of the most famous hacks of the last five years stole data in these phases. Enterprises treating incomplete data encryption as a panacea leave themselves at major risk of sensitive data exposure – especially as this data makes its way into the cloud.

The good news is that there are technologies that can be used to protect cloud data across all of its phases. For example, platforms exist that act as an encryption or tokenization point before data leaves the control of the enterprise and goes to the cloud provider. The data itself – on the way to the cloud, while being stored in the cloud, and while being processed in the cloud – is always encrypted and protected, and only the enterprise can bring the information back into usable form. These Cloud Data Protection products are part of the emerging CASB category and should be on every CIO’s list of technologies to consider for ensuring secure and compliant cloud adoption in their enterprises.

Agency: Cordis | Branch: FP7 | Program: CSA | Phase: ICT-2011.1.4 | Award Amount: 779.12K | Year: 2012

Certification, InteRnationalisation and standaRdization in cloUd Security (CIRRUS) aims to bring together representatives of industry organizations, law enforcement agencies, cloud service providers, standards and certification organizations, cloud consumers, auditors, data protection authorities, policy makers, the software component industry and others with diverse interests in security and privacy issues in cloud computing.

Different stakeholders have different expectations, views and requirements related to cloud computing. Users are worried about data portability and cloud interoperability, and want to ensure privacy and security when migrating their data from one cloud to another. Concerns about security in the cloud can prevent certain users, such as critical infrastructure operators, from moving their data to the cloud. Challenges coming from national legislation and cross-country agreements need to be faced by law enforcement agencies, whilst compliance, auditing and certification are important for ICT service providers when dealing with their cloud-related business. Loss of control, confidentiality, and auditing and compliance implications are the main concerns for Chief Information Officers.

The CIRRUS Consortium and Advisory Board bring representatives of these stakeholders together. The project has an excellent balance of academic, private and public partners that enables balancing of their needs and views while maintaining the vision and high-level objectives, such as bringing research project results to the market and improving trust in cyberspace.

Cirrus clouds are among the highest-altitude clouds in the troposphere: the CIRRUS project likewise aims to provide high-level, high-impact support and coordination for European ICT security research projects. Project activities target joint standardization and certification schemes, link research projects with EU policy and strategy, and address internationalization as well as industry best practices and public-private cooperation initiatives.
