User:Stinglehammer/sandbox16

Introduction

In July 2016, Forbes published an article entitled ‘The trillion dollar tech war’. The cloud computing market, dominated by Amazon Web Services since 2006, developed in the decade that followed into a “tsunami” (Konrad, 2016) that is revolutionising how businesses think about and use technology; from start-ups like Airbnb, Instagram and Pinterest to corporations like GE, NBC and Shell. Now, nearly every business is looking at cloud computing, with exponential growth in the market forecast: “As the data-intensive Internet of Things becomes a reality, the cloud is shaping up to be the biggest business opportunity in a generation, and Amazon, Microsoft and Google all want to claim the biggest slice…. the real winner from all this competition is hopefully the customers” (Konrad, 2016).

In the UK, five years earlier, the Archives and Records Association produced a report on cloud computing which demonstrated that “cloud computing is poised on the brink of ubiquity” (Cumming, 2011). That ubiquity has only become more pronounced in the years since, with vendors “aggressively moving to a cloud-first roadmap” (McKinnon, 2015). In 2015, the global cloud computing sector grew by 42% to reach a value of $63,707.1 million and become one of the fastest growing sectors of technology. It is forecast to have a value of $347,723.8 million by 2020, an increase of 445.8% since 2015 (Global Cloud Computing Industry Profile, 2015). While the USA accounts for 48.6% of the global cloud computing market, the UK, Germany & France are the leading markets in Europe. Cloud computing in the UK grew by 39.9% in 2015 to reach a value of $3,455.3 million, and is forecast to have a value of $16,944.5 million by 2020, an increase of 390.4% (Cloud Computing Industry Profile: UK, 2015). Further, according to the Cisco Global Cloud Index 2014–2019, “86% of workloads will be processed by cloud data centers, leaving only 14% to be processed by traditional data centers.” (Datskovsky, 2016) The cloud is here to stay, and the records professional needs to adapt quickly to this paradigm shift.
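As a quick sanity check, the forecast percentages quoted above are consistent with the underlying market values. A short Python calculation, using the figures as cited, confirms this:

```python
# Sanity check of the growth figures cited from the 2015 industry profiles.
# All values are in $ millions, as quoted in the text.

def pct_increase(start, end):
    """Percentage increase from start to end, to one decimal place."""
    return round((end - start) / start * 100, 1)

global_2015, global_2020 = 63707.1, 347723.8   # global market value
uk_2015, uk_2020 = 3455.3, 16944.5             # UK market value

print(pct_increase(global_2015, global_2020))  # 445.8 — matches the quoted 445.8%
print(pct_increase(uk_2015, uk_2020))          # 390.4 — matches the quoted 390.4%
```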

A multitude of benefits

The cloud offers a multitude of benefits which are very attractive to businesses. Chief among them is that organisations can access virtually unlimited computing power and storage capacity, enabling them to dispense with vast expenditure on new hardware & software (Ferguson-Boucher, 2011). For organisations with limited IT expertise, or limited potential to develop their IT infrastructure, this can be a way of future-proofing the organisation. Further benefits of the cloud include:
•	services can be provided on-demand as required;
•	the cloud network can be accessed quickly & easily from anywhere in the world;
•	there is no long-term commitment;
•	storage capacity can be increased or decreased as necessary;
•	organisations can scale their cloud operations up or down as necessary for peak business times;
•	it is easy to run experimental initiatives with little in the way of capital or infrastructure requirements;
•	an organisation’s own in-house IT can be re-purposed to look after more “business-critical tasks” (Ferguson-Boucher, 2011);
•	increased reliability and enhanced security;
•	storing information in multiple locations can help prevent loss of information;
•	business continuity and disaster recovery contingencies can be employed through cloud backups. For example, the City of Los Angeles employs a cloud backup of its Emergency Operations Centre, which “costs the city just $300 a month, saves time, and maintains operations when they’re needed most…. We can set up a virtual EOC anytime, anywhere.” (Zaleski, 2016)

Understanding the risks of the cloud

“Cloud computing as a new delivery model is proving challenging for recordkeeping professionals” (Ferguson-Boucher and Convery, 2011)

It is worth remembering, as Cunningham (2016) and Borglund (2015) assert, that the cloud is not inherently problematic and that physical archives can be every bit as risky as the cloud, if not riskier. That being said, part of the reason that adoption of cloud services “remains in a nascent stage” (McKinnon, 2015) seems to be because the risks associated with the cloud are more “fuzzy and difficult to understand” (Borglund, 2015), potentially because of the ‘black box syndrome’ associated with CSPs.

The InterPARES Trust project outlines a few of the challenges facing recordkeeping professionals on its “Records in the Cloud” project webpage: “We have seen cloud providers go bankrupt, disappear or be sold; records being lost, retained when should have been destroyed, or mixed-up in shared servers; failed back-up; and unauthorised access by sub-contractors and hackers. Further, it is impossible to pinpoint the geographical location of the records at any given time and the jurisdiction under which they fall; to prove the chain of custody and the authenticity of the records; to ensure protection of legal privilege or trade secrets when using a third party; to isolate documents for legal hold; to conduct audits; and to guarantee that the records to be permanently preserved are kept according to archival standards.” (InterPARES, 2016)

Moving to cloud computing is presented as a way of reducing capital expenditure whilst also maximising computing power & resources. Yet there are still costs involved in terms of: the preparations involved in implementing cloud services alongside existing business processes; enabling multi-cloud environments to work with one another; transferring information between multi-cloud environments where proprietary application programming interfaces (APIs) are employed; the day-to-day management of the cloud network(s); implementing specific security, performance-monitoring and records management software within the cloud (Ferguson-Boucher and Convery, 2011); and there may also be costs related to the cloud service provider being able to “analyse and export needed data” on demand (Boyd, 2014). Further, while increased reliability of service is a perceived benefit of utilising the cloud, the lack of control when service outages do occur can be hugely problematic. Indeed, control appears to be one of the thorniest issues relating to cloud service providers (CSPs), so due diligence prior to any ceding of control is key.
•	Boyd (2014) points out that the implications involved in regaining control of your organisation’s data following the end of an agreement, or indeed should the worst happen and the CSP files for insolvency, could be horrendous unless the initial setup agreement has taken this into consideration and due diligence has been undertaken to mitigate this risk.
•	Outsourcing the security for your organisation’s files can be a risky strategy too if the CSP does not perform due diligence in this regard, particularly in relation to access permissions & data privacy.
•	The geographical location of the organisation’s data can also be critically important in terms of compliance with data regulations and in terms of ensuring business continuity, so again due diligence is warranted in checking these issues when setting up an agreement with a CSP.
•	Who owns the data? This potential for loss of control can affect data security and record authenticity, so it needs to be understood in granular detail, down to the metadata created in each transaction, prior to the signing of the setup agreement. Afterwards, it may be too late, and the result may be a permanent loss of ownership.

Responding to the challenges

The main challenges for the information professional amount to: “information retrieval and destruction; loss of control over information and data protection” (Ferguson-Boucher and Convery, 2011). Yet these issues are far from new ones for the records management professional. Applying rigorous due diligence at the outset, prior to agreeing terms with a cloud service provider, can circumvent them. Further, the ARA’s Cloud Computing toolkit (Convery, 2011) provides enough guidance to inform even the most recalcitrant RIM professional: “The detail provided is excellent – in depth, comprehensive and relevant, giving real insight and specific suggestions for mitigating risks and implementing effective cloud arrangements…. [presenting] three case studies, each providing a very positive perspective on the process and organisational efficiencies that can be achieved using cloud computing frameworks. The final section contains a consolidated listing of the more than 200 assessment questions posed throughout the toolkit. In combination like this, these questions provide a very comprehensive basis to help organisations assess their readiness and risk tolerances for cloud computing.” (Cumming, 2011)

One of the chief concerns of working with a cloud service provider, as put forward by Ferguson-Boucher and Convery (2011), relates to ensuring compliance with the Data Protection Act 1998. This throws up a number of challenges in the cloud environment in that compliance depends on: knowing where information is physically stored (often in data centres around the world); how it is protected from unauthorised access (outsourced to the CSP); ensuring personal information can be accessed in a timely fashion according to data access requests (outsourced to the CSP); and ensuring that personal information is not kept for longer than necessary (outsourced to the CSP).
Yet, in saying this, most cloud providers are themselves seeking compliance with standards such as FISMA (2002) and ISO 27001, and have already changed, or are in the process of changing, their terms & conditions in order to specify where information is stored (Ferguson-Boucher and Convery, 2011). In addition, the UK’s Information Commissioner’s Office (ICO) has produced a 24-page booklet, ‘Guidance on the use of cloud computing’, for organisations to follow in order to fully ensure compliance with the Data Protection Act (1998). Combining this guidance from the ICO with the ARA’s toolkit (Convery, 2011), and with the other literature already plentifully available on how to mitigate risk in setting up an agreement with a cloud service provider, there should be no reason for the records professional to go into the process feeling daunted.

Hence, far from Lappin’s (2010) lament about the records manager heading back to the basement once again, articulating the need for an RM approach to any proposed agreement with a cloud service provider should place the records professional firmly out of the basement and into the boardroom, as it requires a holistic RM approach to how an organisation manages its business processes with the CSP. It follows that the records manager must be involved at the outset to embed best archival practice.

Cloud agreements – setting up for success

The due diligence the RM performs at the agreement’s outset will set the tone for the organisation’s involvement with the CSP pre-agreement, during the agreement, and post-agreement, so it is vitally important the records professional goes into negotiations fully informed, with eyes wide open as to the risks & pitfalls.

Negotiations with CSPs may be difficult for smaller organisations, as Oppenheim (2012) suggests, or nigh-on impossible where the internet infrastructure simply isn’t available to make an agreement viable, or where competition is so reduced as to make the terms of agreement unappealing. Yet “proportion not (necessarily) perfection” (McLeod, 2012) will inevitably be the way forward in this post-custodial digital world, wherein records managers “actively contribute their archival knowledge to the development of cloud-based services to influence cloud computing service providers’ practices” (McKemmish in Guo et al, 2016).

Trust, therefore, is the most important issue: whichever cloud provider or model an organisation opts for, it is vitally important to set up a contract or agreement that increases trust. Borglund (2015) suggests that focusing on information security, and the management of information security, is the one strategy that implicitly increases trust.

Cunningham (2016) argues that if the risks are clearly understood and proper contractual agreements are put in place, then there is no reason why agreements with CSPs should be problematic. With this in mind, Ferguson-Boucher (2011) developed ten questions to ask when outsourcing to the cloud (included in Appendix A). These relate to: identifying & mitigating areas of risk; monitoring requirements; legislative compliance; ensuring the integrity & authenticity of records; transparency over costs; and transparency over access and future renegotiations. Depending on the answers, this process will either increase trust in the enterprise, warrant further investigation into other models, or mean shelving the idea until such time as the CSP, the infrastructure or the regulatory framework is deemed sufficiently mature to revisit it. For Stancic et al (2015), the ‘trusted cloud service’ can be further enhanced through the concept of the ‘electronic document safe’ (EDS) – a “secure storage for official documents” whereby documents are a) fragmented and distributed among many CSPs to make it more difficult to access the original document, and b) only accessible through encrypted communication & authentication (possibly using the additional security blockchain encryption offers). This depends on two issues being addressed: privacy protection and long-term service availability. Of these, the long-term availability of the cloud service provider is the greatest unknown, and one which no private company could credibly guarantee. Hence, Stancic et al’s (2015) proposal to mitigate this risk is for a governmental cloud provider to be the fall-back position for business continuity.
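Stancic et al’s fragmentation idea can be illustrated with a minimal, hypothetical sketch. The provider names and splitting scheme below are invented for illustration only; a real electronic document safe would additionally encrypt each fragment and require authenticated, encrypted channels to retrieve it:

```python
# Minimal sketch of the 'electronic document safe' fragmentation idea:
# a document is split into fragments held by different (hypothetical)
# cloud providers, so that no single provider holds the whole document.

def fragment(document: bytes, n: int) -> list[bytes]:
    """Split a document into n roughly equal byte fragments."""
    size = -(-len(document) // n)  # ceiling division
    return [document[i:i + size] for i in range(0, len(document), size)]

def reassemble(fragments: list[bytes]) -> bytes:
    """Recover the document; every fragment is required."""
    return b"".join(fragments)

providers = ["csp-a", "csp-b", "csp-c"]  # hypothetical CSPs
record = b"Official document: certificate of incorporation"
shares = dict(zip(providers, fragment(record, len(providers))))

# Only by retrieving all fragments can the original be reconstructed.
assert reassemble([shares[p] for p in providers]) == record
```

The design choice this illustrates is that fragmentation alone only raises the cost of unauthorised access; it is the combination with encryption and authentication, as Stancic et al stress, that makes the safe trustworthy.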
In the UK, the government is already implementing the ‘G-Cloud’ framework, “built on the government’s own existing infrastructure which ensures that some security risks, such as multi-tenancy and data centre locations, can be avoided” (Ferguson-Boucher and Convery, 2011), so it is not outwith the realms of possibility that it could ensure the trustworthiness & authenticity of records are preserved in the longer term. However, this would obviously depend on the political will to do so, when budgetary constraints are clearly making governments look at outsourcing in the first place.

Past research – the problem of trust

•	The eight-month research project for the Archives & Records Association at Aberystwyth University in 2010 found that the implementation of cloud computing was at an “embryonic” stage, where lack of trust “emerged as a main factor stalling the adoption of cloud computing.” (Ferguson-Boucher and Convery, 2011)
•	Guo et al (2016) found that, in the case of the Tianjin Municipal Archives, cloud computing addressed serious storage and capacity issues; however, the archives were cautious about pursuing it due to “the inherent risks and the fact that there are not yet mature solutions and policies to mitigate these risks.”
•	Franks (2015) describes the InterPARES Trust research project (2013–18), under which several studies were approved (some are due to publish their findings in the next year). Two key areas recommended for enshrining in the CSP service agreement were the preservation of metadata and the enforcement of retention periods.
•	Askhoj et al (2011) argue that one way to preserve digital records in the cloud is to repurpose the Open Archival Information System (OAIS) model with a platform-as-a-service (PaaS) layer, a software-as-a-service (SaaS) layer, a preservation layer, and an interaction layer.
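Askhoj et al’s layered proposal can be loosely pictured as a pipeline in which each layer hands the record on to the next. The sketch below is hypothetical: the function names are invented, the PaaS storage layer is elided, and the packaging is a gross simplification of the OAIS SIP/AIP/DIP model, included only to show the shape of the idea:

```python
# Hypothetical sketch of a layered cloud-preservation pipeline, loosely in
# the spirit of Askhoj et al's proposal: a SaaS layer captures the record
# (a SIP), a preservation layer packages it with fixity metadata (an AIP),
# and an interaction layer disseminates it to users (a DIP). All names here
# are illustrative, not taken from the original model.

import hashlib
from datetime import datetime, timezone

def saas_capture(content: str) -> dict:
    """SaaS layer: capture the record with minimal metadata (a SIP)."""
    return {"content": content,
            "captured": datetime.now(timezone.utc).isoformat()}

def preservation_package(sip: dict) -> dict:
    """Preservation layer: wrap the SIP into an AIP with a fixity checksum."""
    aip = dict(sip)
    aip["sha256"] = hashlib.sha256(sip["content"].encode()).hexdigest()
    return aip

def interaction_disseminate(aip: dict) -> str:
    """Interaction layer: produce a DIP for the requesting user."""
    return aip["content"]

record = saas_capture("Minutes of the board meeting")
aip = preservation_package(record)
assert interaction_disseminate(aip) == "Minutes of the board meeting"
```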

Enabling trust through the changing role of the record professional

“The archivist has become more of a guardian of the entire organization’s archival interests than merely guardian of the records.” (Borglund, 2015)

Borglund (2015) argues that managing digital records in the cloud requires a fundamental change in the role of the records manager; moving from a largely reactive role protecting an organisation’s records to pro-actively ensuring the whole organisation’s interests are enshrined in the CSP agreement from day one. Ensuring trustworthy & reliable records also depends on records managers understanding & trusting the cloud service providers in terms of how information is stored and moved in this networked environment. This requires the cloud service provider to demystify the processes associated with its practices, at the same time as up-skilling the records manager to have enough “competence and knowledge about cloud services and IT to define the requirements for a cloud service” (Borglund, 2015). Once effective communication between records managers and CSPs has been achieved, the trustworthiness of records stored in cloud networks may even exceed that of records in other, in-house IT environments. That depends, however, on the records professional working ever more closely with the IT department to ensure the records are monitored effectively. Rather than seeing the rise of cloud services as a “wicked problem” (McLeod, 2014), we should instead embrace the change and rise to the challenge of developing “new, relevant, and up-to-date methods” (Borglund, 2015) which will make digital recordkeeping “compelling” (Reed, 2015).

Jurisdiction

One of the greatest areas of mistrust between records professionals and cloud service providers is one where records professionals have very little control or influence; namely, the risks associated with transborder data flow, defined in Goh (2014) as the “movement across national boundaries of computerized, machine-readable data for processing, storage, or retrieval”. Employing a cloud service provider creates difficulties in terms of the multiple legal systems in play: those covering the CSP itself; the CSP’s customer; the place where the data centres in question are located; and the location of the individuals to whom the data relate. Therefore, any proceedings arising from breach of copyright or privacy issues are likely to be difficult to resolve (Weber in Goh, 2014) because of those multiple legal systems (though not impossible, as several cases have already demonstrated). This is further complicated because data can flow through many jurisdictions, and national laws could therefore, in theory, apply in each country the data flows through, regardless of how quickly the data passes through. Moreover, given that 48.6% of cloud computing worldwide is conducted in the USA, and the main players in this marketplace are US companies (Amazon AWS, Microsoft Azure, Google Cloud etc.), it is unsurprising that cloud service providers’ contracts are often based on US law. Therefore the data flows involved in running such CSPs often contravene European intellectual property & privacy law (Goh, 2014). The EU Directive on Data Protection has gone some way to alleviating concerns in this direction, however, as, in light of the directive, the USA signed up to the Safe Harbor Principles.

Despite this, legislation related to records management has still to catch up with the new transborder reality of data flow: it still assumes that an act committed in the physical territory of a country will be tried under the laws of that jurisdiction (Ryngaert in Goh, 2014). The appropriate jurisdiction, as we’ve seen, is not always easy to ascertain. Therefore, there is a pressing need for clarity in the regulation of CSPs in order to support the work of records management professionals.

Perhaps in response to this need, Hon et al (2016) report that, since Edward Snowden’s revelations of the surveillance undertaken by the NSA in America, there have been calls for a Europe-only cloud or ‘digital Schengen area’ to be established, which would limit the possibility of data exchanges between Europe and other areas of the world. Talks reportedly took place between the German Chancellor, Angela Merkel, and President Hollande of France in 2014 on: “building up a European communication network to avoid emails and other data passing through the United States...we’ll talk about European providers that offer security for our citizens, so that one shouldn’t have to send emails and other information across the Atlantic. Rather, one could build up a communication network inside Europe”. (Hon et al, 2016)

While this may help German cloud providers secure a competitive commercial edge over their US counterparts, problems arise in that the most popular uses of cloud providers are website hosting and SaaS services from Facebook, Yahoo, Microsoft, Amazon, Google & more. Prohibiting the use of all these services, and of all non-European hosting of websites used in Germany, would extend these controls much further than seems practicable. Another issue is how a ‘European provider’ would be defined in the first place. Establishing a Trusted Cloud for Europe, by the European Commission’s European Cloud Partnership Steering Board, rejected this idea of creating a ‘Fortress Europe’ model and stressed instead that: “Non-European cloud providers should be able to access the European cloud market on equal terms… as a part of the Trust [sic] Cloud Europe framework” (Hon et al, 2016). The inference is that European cloud users should use cloud providers who comply with European laws.

Breaking the logjam on jurisdiction

Perhaps taking existing maritime law and applying it as a model law governing ‘cyber seas’ (Narayanan, 2012) may be the best answer to this problem of jurisdiction. This is because the transborder data flows associated with cloud service providers have extraterritorial implications as much as territorial implications. Clopton (2013) defines extraterritorial as the “application of one country’s laws to persons, conduct or relationships outside of that country”. Indeed, utilising maritime law to extend the application of the law from one nation state to the law of many nations in this way may also be extremely timely, given that Google, and other CSPs, have reportedly been taking out patents to build data centres on ships in international waters (Goh, 2014). Under the UN Convention on the Law of the Sea, the high seas are distinguished from a state’s own territorial waters; in this way the global cloud computing network becomes the high ‘cyber’ seas and, as such, would be regulated under an “overseeing international authority” (Narayanan, 2012). In these cyber seas, cloud service providers would become the ships, the jurisdictions where they base their business would become the “flag state”, and the jurisdictions where they have their data centres would become the “port state”.

Conclusion

“Ongoing technical work aims at increasing transparency and control for cloud customers, to enable customer trust in cloud providers to be enhanced, and to ease provider compliance. But that alone will not be enough. In the battle for governmental control of access to digital data, users and service providers, of not just cloud computing but more broadly the Internet, are being caught in the middle. There is a pressing need for governments to act in good faith to seek to resolve these problems in a workable and technologically-neutral manner.” (Hon et al, 2016)

The regulatory environment is indeed a much-needed area of development where governments do need to co-operate to establish a workable framework. This is somewhat out of the records professional’s hands. Trust, however, is a key aspect governing how records managers should approach this paradigm shift. The problems & risks identified are in many ways familiar ones, and the fact that they have been identified, and that so much good work is being done to guide colleagues through the process of setting up agreements to mitigate them successfully, gives one great hope for recordkeeping’s role in this new digital landscape. “If this phenomenon cannot be stopped, we must at least try to reduce its risks to an acceptable level.” (InterPARES, Records in the Cloud, 2016)

I started this piece describing the rise of cloud computing, according to Konrad (2016), as a “tsunami”. That it is unstoppable I have no doubt. Rather than the records manager retreating to the basement, I believe this phenomenon offers new opportunities to “inspire and challenge our colleagues and our organisations” (Reed, 2015) to make digital recordkeeping compelling, in a way that must be proportionate if not perfect (McLeod, 2014), in this new landscape.
Due diligence in preparing for this tsunami (examining case studies, mapping organisational processes to CSP processes, interrogating CSPs’ terms and conditions to secure the best possible model for the organisation) will allow the records manager to get their affairs in order: to batten down the hatches, secure any potential leaks, and chart a course to ride the crest of the wave and set sail among the clouds.