
Amazon announced details about their Q2 earnings yesterday. Their cloud business grew by an incredible 81%. This is massive, given that Amazon is already the number one company in that area. This quarter, they earned 1.8 billion USD from cloud computing.
Annualized, this number suggests their revenue will reach some 7 billion USD this year. However, if this growth keeps accelerating, I guess they could even reach double-digit billions by the end of the year. Will Amazon reach 10 billion in 2015? If so, that would be incredible! Microsoft stated that their growth was well above the 100% mark, so I am interested in where Microsoft will stand by the end of the year.
But what does this tell us? Both Microsoft and Amazon are growing fast in this business, and we can expect many more interesting cloud services in the coming months and years. My opinion is that the market is already consolidated between Microsoft and Amazon; other companies such as Google and Oracle are rather niche players in the cloud market.

Cloud computing has changed how we handle IT in several ways. Common tasks that used to take a lot of time are now highly automated, and much more is still to come. Another interesting development is “Software Defined X”. This basically means that infrastructure elements are automated to a larger degree as well, which makes them more scalable and easier to utilize from applications. A frequently used term lately is the “Software Defined Networking” approach; however, there is another one that sounds promising, especially for cloud computing and Big Data: Software Defined Storage.
Software Defined Storage promises to abstract the way we use storage. This is especially useful for large-scale systems, as no one really wants to care about how content is distributed across different servers. This should basically be opaque to end users (software developers). For instance, if you are using a storage system for your website, you want an API like Amazon’s S3: there is no need to worry about which physical machine your files are stored on – you just specify the desired region. The back-end system (in this case, Amazon S3) takes care of that.
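As a minimal sketch of how this looks in practice (the bucket name and file path are made up, and this assumes the AWS SDK for Java with credentials configured via the default provider chain), an upload to S3 only names a region, a bucket and a key – never a physical machine:

import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import java.io.File;

public class S3UploadExample {
    public static void main(String[] args) {
        AmazonS3 s3 = new AmazonS3Client(); // credentials from the default chain
        // The region is the only "location" we ever specify.
        s3.setRegion(Region.getRegion(Regions.EU_WEST_1));
        // Which physical machines end up storing the object is opaque to us.
        s3.putObject("my-example-bucket", "uploads/report.pdf", new File("/tmp/report.pdf"));
    }
}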

Software defined Storage explained

As for the architecture, you simply communicate with the abstraction layer, which takes care of distribution, redundancy and other factors.
At present, there are several systems available that take care of that: next to well-known systems such as Amazon S3, there are also other solutions such as the Hadoop Distributed File System (HDFS) or GlusterFS.
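To illustrate the same abstraction with an open-source system, here is a small sketch against the HDFS Java API (the namenode address and the paths are assumptions for the example): the client only talks to the file-system abstraction, while HDFS decides which datanodes hold the blocks and replicas.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUploadExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed namenode address; on a real cluster this comes from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
        FileSystem fs = FileSystem.get(conf);
        // HDFS distributes and replicates the blocks; we never address a datanode directly.
        fs.copyFromLocalFile(new Path("/tmp/report.pdf"), new Path("/data/report.pdf"));
        fs.close();
    }
}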
 
Header Image Copyright: nyuhuhuu. Licensed under the Creative Commons 2.0.

Amazon Web Services today announced its new datacenter for Germany in Frankfurt. This is AWS region number 11 and the second in Europe. AWS will support a large number of services from that datacenter.
Here is the original press release:

SEATTLE—Oct, 23, 2014– (NASDAQ:AMZN) — Amazon Web Services, Inc. (AWS, Inc.), an Amazon.com company, today announced the launch of its new AWS EU (Frankfurt) region, which is the 11th technology infrastructure region globally for AWS and the second region in the European Union (EU), joining the AWS EU (Ireland) region. All customers can now leverage AWS to build their businesses and run applications on infrastructure located in Germany. As with every AWS region, customers can do this knowing that their content will stay within the region they choose. The newly launched AWS EU (Frankfurt) region comes as a result of the rapid growth AWS has been experiencing and is available now for any business, organization or software developer to sign up and get started at: http://aws.amazon.com.

All AWS infrastructure regions around the world are designed, built, and regularly audited to meet rigorous compliance standards including ISO 27001, SOC 1 (formerly SAS 70), PCI DSS Level 1, and many more, providing high levels of security for all AWS customers. AWS is fully compliant with all applicable EU Data Protection laws, and for customers that require it, AWS provides data processing agreements to help customers comply with EU data protection requirements. More information on how customers using AWS can meet EU data protection requirements and local certifications such as BSI IT Grundschutz can be found on the AWS Data Protection webpage at: aws.amazon.com/de/data-protection. A full list of compliance certifications can be found on the AWS compliance webpage at: http://aws.amazon.com/compliance/.

The new AWS EU (Frankfurt) region consists of two separate Availability Zones at launch. Availability Zones refer to datacenters in separate, distinct locations within a single region that are engineered to be operationally independent of other Availability Zones, with independent power, cooling, and physical security, and are connected via a low latency network. AWS customers focused on high availability can architect their applications to run in multiple Availability Zones to achieve even higher fault-tolerance. For customers looking for inter-region redundancy, the new AWS EU (Frankfurt) region, in conjunction with the AWS EU (Ireland) region, gives them flexibility to architect across multiple AWS regions within the EU.

“Our European business continues to grow dramatically,” said Andy Jassy, Senior Vice President, Amazon Web Services. “By opening a second European region, and situating it in Germany, we’re enabling German customers to move more workloads to AWS, allowing European customers to architect across multiple EU regions, and better balancing our substantial European growth.”

Many German customers are already using AWS including Talanx, in the highly regulated insurance sector. Talanx is one of the top three largest insurers in Germany and one of the largest insurance companies in the world with over €28 billion in premium income in 2013. “For Talanx, like many companies that hold sensitive customer data, data privacy is paramount,” says Achim Heidebrecht, Head of Group IT, Talanx AG. “Using AWS we are already seeing a 75% reduction in calculation time, and €8 million in annual savings, when running our Solvency II simulations while still complying with our very strict data policies. With the launch of the AWS region on German soil, we will now move even more of our sensitive and mission critical workloads to AWS.”

Hubert Burda Media is one of the largest media companies in Europe with over 400 brands and revenues in excess of $3.6 billion. JP Schmetz, Chief Scientist of Hubert Burda Media said of the announcement, “Now that AWS is available in Germany it gives our subsidiaries the option to move certain assets to the cloud. We have long had policies preventing data to be hosted outside of German soil and this new German region gives us the option to use AWS more meaningfully.”

Academics in Germany were also quick to welcome the new region, “The arrival of an Amazon Web Services Region in Germany marks an important occasion for the German business and technology community,” said Prof. Dr Helmut Krcmar, Vice Dean of the Computer Science Faculty, and Chair of Information Systems at the Technical University of Munich. “We work with a number of DAX listed companies in Germany. Many have been holding off moving sensitive workloads to the cloud until they had computing and service facilities on German soil as this could help them comply with their internal processes. This new region from AWS answers this and we expect to see innovation amongst Germany, and Europe’s, companies flourish as a result.”

The Header Image was published by Martin aka Maha under the Creative Commons License.

The AWS Java SDK version 1.8.10 ships with a critical bug affecting uploads. AWS has provided a fix, and normally the SDK is updated automatically, so you don’t need to worry.
However, if automatic updates are disabled in your Eclipse installation, you might lose data when uploading via SDK version 1.8.10. Here is what AWS has to say about the bug:

AWS Message

Users of AWS SDK for Java 1.8.10 are urged to immediately update to the latest version of the SDK, version 1.8.11.
If you’ve already upgraded to 1.8.11, you can safely ignore this message.
Version 1.8.10 has a potential for data loss when uploading data to Amazon S3 under certain conditions. Data loss can occur if an upload request using an InputStream with no user-specified content-length fails and is automatically retried by the SDK.
The latest version of the AWS SDK for Java can be downloaded here:
http://aws.amazon.com/sdk-for-java/
And is also available through Maven central:
http://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk/1.8.11/


The bug itself has been fixed, and normally the AWS SDK updates itself automatically in Eclipse. But if automatic updates are disabled and you are still on SDK version 1.8.10, you should update manually now.
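Independent of the fix, the failure condition AWS describes – an InputStream upload without a user-specified content length – is easy to avoid in your own code. Here is a minimal sketch (bucket and key names are made up) that sets the content length explicitly via ObjectMetadata:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import java.io.ByteArrayInputStream;

public class SafeStreamUpload {
    public static void main(String[] args) {
        AmazonS3 s3 = new AmazonS3Client();
        byte[] payload = "hello world".getBytes();
        // Declaring the length up front means the SDK never has to retry a
        // stream whose size it does not know – the situation the bug relates to.
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(payload.length);
        s3.putObject(new PutObjectRequest("my-example-bucket", "example-key",
                new ByteArrayInputStream(payload), metadata));
    }
}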

The blog I started on Cloud Computing and Big Data some years ago has seen a steadily increasing number of readers. CloudVane has also been named one of the Top 100 Blogs on Cloud Computing (Source), which is reflected in the number of visits I get per day. To meet the increased traffic, I had to scale up my blog.
There was no question that I would use some kind of cloud platform. Until now, I used Amazon Web Services. As I am always keen on using the newest technology, I decided to go with a Platform as a Service provider. The reasons for that vary: most importantly, I don’t want to take care of VM management and the like, so I was looking for a platform that eases administration – ideally requiring little or none at all.
I looked at the three most common platforms: Amazon Elastic Beanstalk, Google App Engine and Windows Azure. After playing with all three, doing load tests, comparing pricing and looking at the scalability aspects of each platform, I decided to use Windows Azure. To me, it seemed to be the most mature PaaS offering (this is my personal opinion after doing some research and does not represent the opinion of my employer). Windows Azure Web Sites is very easy to handle, and the features it offers are great.
Moving to Windows Azure Web Sites was straightforward: I created a WordPress instance from the templates provided in the Windows Azure gallery, and after two configuration steps and the WordPress one-click setup, the blog was ready to go. The hardest part of the migration was moving the existing blog entries over; thanks to the import/export capabilities of WordPress, this was done in short time as well. Installing the plugins and so on took some more hours, but it also went smoothly.
In the next posts, I will talk about performance and setup/architecture of WordPress on Windows Azure.
Header Image Copyright by: leolintang

From now on, I will post some developer content, focusing on small but helpful tasks when working with various cloud platforms. These tips will be named after the service (e.g. Amazon Web Services for AWS, …)
The first tip I want to show is how to retrieve the full queue URL when you already have the queue name:

sqs.getQueueUrl(new GetQueueUrlRequest().withQueueName("myqueue")).getQueueUrl();

The function “getQueueUrl()” already returns a String representation rather than a URI itself (which is what I would rather expect in that case).
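For context, here is a self-contained sketch of that call (the queue name “myqueue” is just an example, and the client uses the default credentials chain), including the manual conversion for anyone who, like me, would rather work with a java.net.URI:

import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClient;
import com.amazonaws.services.sqs.model.GetQueueUrlRequest;
import java.net.URI;

public class QueueUrlExample {
    public static void main(String[] args) {
        AmazonSQS sqs = new AmazonSQSClient();
        // getQueueUrl() returns the queue URL as a plain String.
        String queueUrl = sqs.getQueueUrl(
                new GetQueueUrlRequest().withQueueName("myqueue")).getQueueUrl();
        // Convert manually if an actual URI object is needed.
        URI uri = URI.create(queueUrl);
        System.out.println(uri);
    }
}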