In this special issue, we have collected eight articles that offer new directions for research on information and communications technology (ICT)-based systems. We focused on the intuitive nature of the relationship between new ICT-based systems and contemporary management, forming an integrative unit of analysis rather than focusing solely on new ICT-based systems and treating contemporary management as a moderating or mediating factor. This special issue promoted interdisciplinary research at the intersection of new ICT-based systems and contemporary management, including cybernetic systems and knowledge management, service management and the Internet of Things, cloud and marketing management, business process re-engineering and management, knowledge management, and strategic business management, among others.
This chapter examines the array of technologies that are transforming the global TV system. The first part is devoted to communications satellites, which fulfil multiple distribution functions for all kinds of rights holders, and the second turns to internet distribution. It covers the origins of video streaming before explaining how it works and why it is dethroning broadcasting and downloading as the most popular way of accessing content. The chapter examines the role of video coding formats, content delivery networks (CDNs), and cloud computing in video distribution. It concludes by highlighting the role of standards and standard-setting organisations, arguing that their international evolution mirrors that of the TV industry and emphasising the crucial role they play in digital value chains over which no single actor has oversight.
The uptake of electric vehicles (EVs) and renewable energy technologies is changing the magnitude, variability, and direction of power flows in electricity networks. To ensure a successful transition to a net zero energy system, it will be necessary for a wide range of stakeholders to understand the impacts of these changing flows on networks. However, there is a gap between those with the data and capabilities to understand electricity networks, such as network operators, and those working on adjacent parts of the energy transition jigsaw, such as electricity suppliers and EV charging infrastructure operators. This paper describes the electric vehicle network analysis tool (EVENT), developed to make network analysis accessible to a wider range of stakeholders in the energy ecosystem who might not have the bandwidth to curate and integrate disparate datasets and carry out electricity network simulations. EVENT analyses the potential impacts of low-carbon technologies on congestion in electricity networks, helping to inform the design of products and services. To demonstrate EVENT's potential, we use an extensive smart meter dataset provided by an energy supplier to assess the impacts of electricity smart tariffs on networks. Results suggest network operators and energy suppliers will have to work much more closely together to ensure that customers' flexibility to support the energy system is maximised while respecting safety and security constraints within networks. EVENT's modular and open-source approach enables integration of new methods and data, future-proofing the tool for long-term impact.
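The feeder-level congestion analysis described above can be illustrated with a minimal sketch. Everything below (the function name, the load figures, and the 10 kW feeder limit) is hypothetical for illustration and is not taken from the EVENT implementation:

```python
# Hypothetical sketch (not the actual EVENT code): aggregate per-household
# smart-meter demand onto a shared feeder and flag intervals where the
# combined load exceeds the feeder's assumed thermal limit.

def congested_intervals(household_loads_kw, feeder_limit_kw):
    """household_loads_kw: list of per-household demand series (kW),
    all sampled on the same half-hourly grid."""
    n = len(household_loads_kw[0])
    totals = [sum(h[t] for h in household_loads_kw) for t in range(n)]
    return [t for t, total in enumerate(totals) if total > feeder_limit_kw]

# Three households; the second interval coincides with EV charging
# concentrated by a smart tariff's cheap-rate window.
loads = [
    [1.0, 4.5, 1.2],
    [0.8, 3.9, 0.7],
    [1.1, 4.2, 0.9],
]
print(congested_intervals(loads, feeder_limit_kw=10.0))  # -> [1]
```

The sketch shows why tariff design matters to network operators: a tariff that synchronises charging can push the aggregate over a limit that no single household approaches.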
Attendance is critical to the success of any business or industry; as a result, most businesses and institutions require a system to track staff attendance. Meanwhile, cloud computing technology is being utilized in the human resource management sector, where it can be an excellent option for processing and storing large amounts of data and for raising management effectiveness to a desirable level. Hence, this paper examines cloud infrastructures for employee attendance management, categorizing the reviewed articles into three groups. The results show that cloud infrastructure has a significant and positive impact on the management of employee attendance systems. They also reveal that the radio frequency identification (RFID) authentication protocol protects the privacy of tags and readers against attacks on database memory. When such systems operate properly, they benefit the people concerned and society by making workplaces more efficient and safer.
Teaching formats have constantly evolved over the years to adapt to newer methods of student learning. In ancient times, under the 'Gurukul' system, students would go and live with a teacher and learn all that the teacher knew and practised, by listening and observing. The method adopted was a tacit-to-tacit knowledge transfer between the teacher and the students. Learning was a function of the student's ability to absorb skill and knowledge, and evaluation was a function of a real demonstration of the student's ability. This has changed over the last hundred years, as a more formal schooling system evolved that brings teachers and students together in one place, with knowledge transferred explicitly in the form of a prescribed curriculum. Learning in a 'classroom' environment is more formatted, and evaluation is based on formal assessments. In recent years, considerable interest has developed in innovating teaching formats so that 'student-centric learning' can be practised in a classroom environment. As described by Kaplan (2021) in this book's first chapter, changes triggered by the COVID-19 pandemic have impacted the education sector extensively, establishing a digital mode of instruction as the new normal. The digital platform, and the transformation it can bring to practice, has created new interest among the teaching community in exploring better teaching formats.
Modern digital life has produced big data in modern businesses and organizations. To derive information for decision-making from these enormous data sets, a lot of work is required at several levels. The storage, transmission, processing, mining, and serving of big data create problems for digital domains. Despite several efforts to implement big data in businesses, basic issues with big data remain (particularly big-data management (BDM)). Cloud computing, for example, provides companies with well-suited, cost-effective, and consistent on-demand services for big data and analytics. This paper introduces the modern systems for organizational BDM. This article analyzes the latest research to manage organization-generated data using cloud computing. The findings revealed several benefits in integrating big data and cloud computing, the most notable of which is increased company efficiency and improved international trade. This study also highlighted some hazards in the sophisticated computing environment. Cloud computing has the potential to improve corporate management and accountants' jobs significantly. This article's major contribution is to discuss the demands, advantages, and problems of using big data and cloud computing in contemporary businesses and institutions.
This chapter introduces the subject matter of the book, provides the core problem statement and defines the central terms used in the book. The introduction also explains the focus on governmental adoption of cloud computing services, legal sources, and the research approach.
The introduction explains how cloud computing has made it possible and desirable for users, such as businesses and governments, to migrate their data to be hosted on infrastructure managed by third parties. The chapter further outlines why aspects of migration to cloud services pose specific legal, contractual, and technical challenges for governments.
The chapter further outlines the challenge of addressing contracting and procurement requirements, data privacy, and jurisdictional obligations when using an opaque, global, multi-tenant technology such as cloud computing.
This chapter contains the second part of the book's study on cloud computing contracts.
The chapter examines how general contract law, as defined in the chapter, will likely apply to the use of cloud computing services. This chapter focuses on terms that are often considered standard in cloud agreements. The analysis includes terms aimed at keeping information confidential, non-disclosure agreements, terms regarding liability, warranties, and other terms and conditions aimed at regulating or limiting responsibility. Additionally, the chapter considers terms aimed at termination of services, portability and other provisions necessary for exiting services.
In addition to offering an evaluation of specific contract terms, the chapter also evaluates how governments might create better cloud computing contracts to generate more consistent and compliant results.
This chapter evaluates the key data protection requirements and compliance obligations that governments must account for when entering into contracts with cloud service providers. The chapter concentrates on data protection issues that pose particular barriers for governments attempting to adopt cloud-computing services.
The chapter focuses primarily on understanding how the General Data Protection Regulation (GDPR) impacts the use of cloud computing. This requires an analysis of applicability and jurisdiction, applications of principles, understanding roles and responsibilities under the law, contractual obligations on sub-processors, liability for compliance, and limits on data transfers among others. The chapter also provides an overview of US data privacy law.
The chapter further evaluates recent case law and guidance from the European Data Protection Board (EDPB) and national data protection authorities to draw conclusions regarding GDPR cloud compliance obligations. Specifically, the chapter focuses on challenges and limits to cross-border transfers of data following the CJEU decision in the “Schrems II” case.
This chapter provides an overview of cloud computing technology. The explanation includes an overview of the differences between traditional outsourcing and cloud computing and how server virtualization makes cloud computing possible. The chapter also identifies the major players in the provision of cloud computing services and the primary cloud computing service and deployment models. The chapter evaluates central security concerns and risks including loss of availability and risks to data portability.
This chapter evaluates the application of jurisdictional principles to cloud computing services and the core challenges for governments and others. The chapter considers the interplay of jurisdiction—the ability of a court to hear a dispute—in the context of physical location, intelligible access to data, and the physical location of servers.
In particular, the chapter focuses on areas of uncertainty, such as the categorization of services and the location of data, and on the limits of current approaches. The chapter argues that the traditional territorial approach to jurisdiction is a poor fit for the properties of cloud computing services, and of data more generally, because data poses unique legal challenges to applying traditional jurisdictional principles.
The chapter provides an analysis of access to cloud computing services for law enforcement and intelligence purposes by the US government. This includes an analysis of the “Microsoft Warrant” case, the US CLOUD Act and its possible conflicts with the General Data Protection Regulation (GDPR), and access by US intelligence agencies under FISA Section 702 and Executive Order 12333.
This chapter contains the first part of the book’s study on cloud computing contracts evaluating the organization and structure of cloud computing contracts in addition to their content. This includes an evaluation of Service Level Agreements (SLAs), the use of master-service and framework agreements, issues related to subcontractors and subcontracting, third-party rights, and liability considerations.
The study applies a qualitative analysis based on both secondary and original data. Secondary data are derived from various research projects in the EU and elsewhere. Original study data are derived from contracts obtained by the author through Freedom of Information (FOI) requests. The study is original in its method and scope in the governmental context. Additionally, the chapter draws on government cloud audits and other guidance from the UK G-Cloud and US FedRAMP programs.
In Government Cloud Procurement, Kevin McGillivray explores the question of whether governments can adopt cloud computing services and still meet their legal requirements and other obligations to citizens. The book focuses on the interplay between the technical properties of cloud computing services and the complex legal requirements applicable to cloud adoption and use. The legal issues evaluated include data privacy law (GDPR and the US regime), jurisdictional issues, contracts, and transnational private law approaches to addressing legal requirements. McGillivray also addresses the unique position of governments when they outsource core aspects of their information and communications technology to cloud service providers. His analysis is supported by extensive research examining actual cloud contracts obtained through Freedom of Information Act requests. With the demand for cloud computing on the rise, this study fills a gap in legal literature and offers guidance to organizations considering cloud computing.
In this paper, we develop a novel game-theoretic model of the interactions between an EDoS attacker and the defender based on a signaling game, a dynamic game of incomplete information. We then derive the best defense strategies for the network defender to respond to EDoS attacks. That is, we compute the perfect Bayesian Nash equilibria (PBE) of the proposed game model, namely the pooling PBE, the separating PBE, and the mixed-strategy PBE. In the pooling equilibrium, each attacker type takes the same action and the attacker's type is not revealed to the defender, whereas in the separating equilibrium, each attacker type uses a different action and hence the attacker's type is completely revealed to the defender. In the mixed-strategy PBE, both the attacker and the defender randomize their strategies to optimize their payoffs. A numerical illustration is also presented to show the efficacy of the proposed model.
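The distinction between pooling and separating behaviour can be made concrete with a toy sketch. The types, signals, and payoff numbers below are invented for illustration and are not the paper's model:

```python
# Toy check of a pooling equilibrium in a two-type signaling game.
# All payoffs are made-up illustrations, not values from the paper.

TYPES = ["edos", "benign"]          # sender's private type
SIGNALS = ["burst", "steady"]       # observable request pattern
RESPONSES = ["defend", "allow"]     # defender actions

# sender payoff u_s[type][signal][response]; defender payoff u_r[type][response]
u_s = {
    "edos":   {"burst": {"defend": -1, "allow": 3}, "steady": {"defend": -1, "allow": 2}},
    "benign": {"burst": {"defend": -2, "allow": 1}, "steady": {"defend": -2, "allow": 2}},
}
u_r = {"edos": {"defend": 2, "allow": -3}, "benign": {"defend": -1, "allow": 1}}

def best_response(belief_edos, signal):
    """Defender's best reply given posterior P(type = edos | signal)."""
    def ev(resp):
        return belief_edos * u_r["edos"][resp] + (1 - belief_edos) * u_r["benign"][resp]
    return max(RESPONSES, key=ev)

def is_pooling_pbe(signal, prior_edos, off_path_belief):
    """Both types send `signal`, so the on-path posterior equals the prior;
    the off-path belief is free, as PBE requires."""
    on_path = best_response(prior_edos, signal)
    other = [s for s in SIGNALS if s != signal][0]
    off_path = best_response(off_path_belief, other)
    # Neither type may gain by deviating to the off-path signal.
    return all(u_s[t][signal][on_path] >= u_s[t][other][off_path] for t in TYPES)
```

With a prior of 0.5 and a sceptical off-path belief of 0.5, `is_pooling_pbe("steady", 0.5, 0.5)` holds: the defender defends either way, so neither type gains by deviating, and the signal reveals nothing about the type.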
The rapid progress in the performance of today's sophisticated transmission electron microscopes also demands computational and educational tools capable of simulating the intricacy of electron image formation. The tools ideally should be widely accessible to the microscopy community and capable of covering both the breadth and depth demanded by modern materials science. Here, a cloud-based microscopy simulation platform, called cloudEMAPS and powered by cloud computing and a modern server-client web programming architecture, is described. Compared to current desktop solutions for electron microscopy simulations, cloud computing offers the unique advantages of on-demand use, data sharing, high-performance computation, and easy internet access using state-of-the-art web infrastructure. This article demonstrates these advantages using examples of interactive simulations of electron diffraction patterns and aberration-corrected electron optics.
The Expanded Program for Immunization Consortium – Human Immunology Project Consortium study aims to employ systems biology to identify and characterize vaccine-induced biomarkers that predict immunogenicity in newborns. Key to this effort is the establishment of the Data Management Core (DMC) to provide reliable data and bioinformatic infrastructure for centralized curation, storage, and analysis of multiple de-identified “omic” datasets. The DMC established a cloud-based architecture using Amazon Web Services to track, store, and share data according to National Institutes of Health standards. The DMC tracks biological samples during collection, shipping, and processing while capturing sample metadata and associated clinical data. Multi-omic datasets are stored in access-controlled Amazon Simple Storage Service (S3) for data security and file version control. All data undergo quality control processes at the generating site followed by DMC validation for quality assurance. The DMC maintains a controlled computing environment for data analysis and integration. Upon publication, the DMC deposits finalized datasets to public repositories. The DMC architecture provides resources and scientific expertise to accelerate translational discovery. Robust operations allow rapid sharing of results across the project team. Maintenance of data quality standards and public data deposition will further benefit the scientific community.
When economists talk about ‘measurement’ they tend to refer to metrics that can capture changes in the quantity, quality, and distribution of goods and services. In this paper we argue that the digital transformation of the economy, particularly the rise of cloud computing as a general-purpose technology, poses serious challenges to traditional concepts and practices of economic measurement. In the first part we show how quality-adjusted prices of cloud services have been falling rapidly over the past decade, which is currently not captured by the deflators used in official statistics. We then discuss how this enabled the spread of data-driven business models, while also lowering entry barriers to advanced production techniques such as artificial intelligence or robotic process automation. It is likely that these process innovations are not fully measured at present. A final challenge to measurement arises from the fragmentation of value chains across borders and the increasing use of intangible intermediate inputs such as intellectual property and data. While digital technologies make it very easy for these types of inputs to be transferred within or between companies, existing economic statistics often fail to capture them at all.
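The gap between list prices and quality-adjusted prices can be sketched with a minimal index calculation. The price and benchmark figures below are invented, not the paper's data:

```python
# Illustrative sketch (invented numbers): a quality-adjusted price index
# for a cloud instance whose list price stays roughly flat while its
# performance per instance-hour improves across periods.

def quality_adjusted_index(prices, performance, base=0):
    """Price per unit of performance, normalised to the base period = 100."""
    qa = [p / q for p, q in zip(prices, performance)]
    return [round(100 * x / qa[base], 1) for x in qa]

prices      = [0.10, 0.10, 0.09]   # list price ($/hour), nearly flat
performance = [1.0,  1.6,  2.0]    # benchmark score per instance-hour
print(quality_adjusted_index(prices, performance))  # -> [100.0, 62.5, 45.0]
```

A deflator built on list prices alone would record almost no change over these three periods, while the quality-adjusted series falls by more than half, which is the measurement gap the paper describes.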
This chapter first introduces the basic concepts of cloud computing and cloud networking. A general cloud network architecture is presented, followed by specific cloud systems, i.e., cloud data center networking, mobile cloud networking, and edge computing. Then, the chapter presents a survey of the game-theoretic and auction models developed and applied to solve issues in cloud networking, including bandwidth reservation and allocation, request allocation, wireless bandwidth allocation, resource management in edge computing, and bandwidth allocation in software-defined networking for cloud computing. The chapter then presents a cooperative game model for mobile cloud resource management, including the full formulation, algorithms, and performance evaluation. Finally, the chapter investigates how to provide efficient insurance in the cloud computing market.
This article discusses the image processing challenges in modern microscopy and microanalysis associated with large dataset sizes, microstructure complexity, and growing computing requirements. Solutions for meeting these challenges include artificial intelligence and cloud computing, which provide improved efficiency in managing microscopy data, more robust automated image segmentation, and prediction of physical properties from images with user-friendly high-performance computing. The applications of these technologies in the industrial research environment are exemplified by studies in the evaluation of amorphous drug formulations, tight rock characterization for sustainable hydrocarbon extraction, and low-temperature fuel cell design for environmentally friendly automobiles.
G-Network queueing network models, and in particular the random neural network (RNN), are useful tools for decision making in complex systems, due to their ability to learn from measurements in real time, and in turn provide real-time decisions regarding resource and task allocation. In particular, the RNN has led to the design of the cognitive packet network (CPN) decision tool for the routing of packets in the Internet, and for task allocation in the Cloud. Thus, in this paper, we present recent research on how to dynamically provide quality of service (QoS) to end users of the Internet and of the Cloud. The approach adapts decisions so as to benefit users as conditions in the Internet and in Cloud servers vary due to changing traffic and workload. We present an overview of the algorithms that were designed based on the RNN, and also detail the experimental results that were obtained in three areas: (i) traffic routing for real-time applications, which have strict QoS constraints; (ii) routing approaches, which operate at the overlay level without affecting the Internet infrastructure; and (iii) the routing of tasks across servers in the Cloud through the Internet.
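A much-simplified sketch of measurement-driven adaptive routing in this spirit is given below. This is not CPN or the RNN itself; the class, parameter names, and smoothing scheme are invented for illustration:

```python
# Minimal sketch of adaptive path selection (not CPN/RNN): keep an
# exponentially smoothed delay estimate per path, route most traffic
# over the current best path, and occasionally probe the alternatives.
import random

class AdaptiveRouter:
    def __init__(self, paths, alpha=0.3, probe_rate=0.1, seed=0):
        self.estimates = {p: None for p in paths}  # smoothed delay per path
        self.alpha = alpha                          # smoothing weight
        self.probe_rate = probe_rate                # exploration probability
        self.rng = random.Random(seed)

    def choose(self):
        unknown = [p for p, e in self.estimates.items() if e is None]
        if unknown:
            return unknown[0]                       # measure each path once
        if self.rng.random() < self.probe_rate:
            return self.rng.choice(list(self.estimates))       # probe
        return min(self.estimates, key=self.estimates.get)     # exploit

    def report(self, path, measured_delay):
        """Fold a fresh delay measurement into the path's estimate."""
        e = self.estimates[path]
        self.estimates[path] = (
            measured_delay if e is None
            else (1 - self.alpha) * e + self.alpha * measured_delay
        )
```

As with the RNN-based schemes surveyed in the paper, the decision rule reacts to live measurements: once delays are reported, `choose()` shifts traffic to the path whose smoothed delay is lowest, while occasional probes keep the estimates for the other paths from going stale.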