We came up with the idea of establishing Newvem with its KnowYourCloud service about two years ago. We wanted to bring our experience with the open source world and data analytics engine development into this new environment – the cloud. The idea evolved quickly as we realized that the vast pool of cloud resources running on the amazing Amazon AWS environment generates an enormous amount of data. Approaching new cloud adopters, we weren't surprised to find IT leaders struggling to regain the clear visibility they were used to having in their traditional on-premises environments.
I am really excited to report that last month was a real breakthrough in exposing our service and knowledge to the world. To help Amazon AWS customers understand their cloud, we published our cost savings scan. We extracted the cost scan from our analytics engine and provisioned it as a simple, quick tool that finds idle and stopped instances. Our team was amazed by the aggregated results: within the first 48 hours, after our analytics engine performed hundreds of scans, we had revealed a potential saving of about $225K in AWS users' monthly costs.
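To make the idea of such a scan concrete, here is a minimal sketch of flagging stopped instances and "idle" running instances. The record fields loosely mirror an EC2 DescribeInstances response, and the 7% average-CPU threshold is an illustrative assumption, not Newvem's actual heuristic.

```python
# Sketch: flag stopped instances and low-utilization "idle" instances.
# The field names and the CPU threshold are illustrative assumptions.

def find_waste(instances, idle_cpu_threshold=7.0):
    """Return (stopped, idle) lists from EC2-style instance dicts."""
    stopped = [i for i in instances if i["state"] == "stopped"]
    idle = [
        i for i in instances
        if i["state"] == "running" and i["avg_cpu_percent"] < idle_cpu_threshold
    ]
    return stopped, idle

instances = [
    {"id": "i-aaa111", "state": "running", "avg_cpu_percent": 2.5},
    {"id": "i-bbb222", "state": "stopped", "avg_cpu_percent": 0.0},
    {"id": "i-ccc333", "state": "running", "avg_cpu_percent": 64.0},
]

stopped, idle = find_waste(instances)
print([i["id"] for i in stopped])  # ['i-bbb222']
print([i["id"] for i in idle])     # ['i-aaa111']
```

A production scan would of course pull the instance list and CPU metrics from the EC2 and CloudWatch APIs rather than a hard-coded sample.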
An important step in cloud adoption is managing quick cycles of learning and improvement of the cloud environment. The following presentation, brought to you by the Amazon AWS team, contains a great number of slides, including best practices and examples for Continuous Deployment, Optimization, and Integration.
In this article I describe how we created a redundant PostgreSQL database on the Amazon cloud, using EBS snapshots as backups, to deploy a PostgreSQL DB server with disaster recovery (DR) for one of our customers' mobile applications.
PostgreSQL 9.1 includes new capabilities for fast, asynchronous replication between a master and its slaves: the master server streams new data to each available slave. This version also brings significant improvements to WAL (Write-Ahead Log) processing, which speeds up replication and allows slave servers to launch quickly.
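A minimal 9.1-era streaming-replication setup might look like the following sketch; the replication user, host name, and exact values are illustrative assumptions, not our production configuration.

```ini
# master: postgresql.conf (illustrative values)
wal_level = hot_standby        # write enough WAL for a hot standby
max_wal_senders = 3            # allow up to 3 streaming slaves
wal_keep_segments = 32         # keep WAL segments for lagging slaves

# master: pg_hba.conf -- let the slave connect for replication
# host  replication  repluser  10.0.0.0/24  md5

# slave: recovery.conf (assumed host and user)
# standby_mode = 'on'
# primary_conninfo = 'host=master.example.com user=repluser'
```

On the slave, a base backup (e.g. restored from an EBS snapshot of the master's data volume) plus this `recovery.conf` is enough for it to start streaming and catch up.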
The cloud presents many security management challenges. Ensuring compliance, identity management, and other security best practices can be a challenging task. AWS Identity and Access Management (IAM) is one of the tools that can be used to mitigate the risks associated with these challenges. In this article, I will discuss a few of the high points of IAM, including the options and limitations that come along with this AWS service's fascinating capabilities.
Many cloud computing users strive to apply security best practices to their cloud computing strategies. One of the best components that Amazon offers to manage security in their cloud computing service is their IAM mechanism, which allows an account owner to create users and manage their permissions within an AWS account.
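As a concrete illustration of managing permissions, an account owner might grant a user read-only EC2 access with a policy document like the one below. This is a minimal sketch: the `Sid` and the action list are illustrative assumptions, not a complete best-practice policy.

```python
import json

# Sketch: a minimal read-only EC2 policy document of the kind an
# account owner could attach to an IAM user. Illustrative only.
def read_only_ec2_policy():
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowEc2ReadOnly",
                "Effect": "Allow",
                "Action": ["ec2:Describe*"],
                "Resource": "*",
            }
        ],
    }

print(json.dumps(read_only_ec2_policy(), indent=2))
```

Attaching a policy like this to a user (rather than sharing the root account credentials) is exactly the kind of scoped access IAM was built for.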
Although more and more cloud newcomers are grasping the essence of the cloud, the challenges are still great. EU and US "cloud regulations" with regard to security and privacy are still a popular topic of discussion in the cloud social sphere. NIST, a US government research organization, with its cloud program, is one of the leaders pushing to define the cloud with the "right rules" supported by relevant standards.
“Cloud computing can and does mean different things to different people. The common characteristics most interpretations share are on-demand scalability of highly available and reliable pooled computing resources, secure access to metered services from nearly anywhere, and displacement of data and services from inside to outside the organization. While aspects of these characteristics have been realized to a certain extent, cloud computing remains a work in progress. This publication provides an overview of the security and privacy challenges pertinent to public cloud computing and points out considerations organizations should take when outsourcing data, applications, and infrastructure to a public cloud environment.”
OpenX Source is a free, open-source ad server that allows publishers to manage their ad inventory and set up complex campaigns with sophisticated targeting rules.
Unfortunately, since OpenX has been focusing on their SaaS solution, the open source version is not well supported anymore. Nevertheless, many publishers still use and maintain their OpenX installations as there is a lack of open source alternatives offering the same flexibility and functionality.
Existing documentation largely assumes a single-server installation. For high-traffic web sites delivering a large volume of advertisement impressions, a more scalable setup is needed.
In this article we will describe the steps we have taken to implement OpenX on a multi-server, auto-scaling setup running in the AWS cloud and managed via Scalr. A similar approach could, of course, be configured outside the Scalr or even the AWS context, and we will mention some of the alternatives.
It is alarming that, across the board, with the ease of bringing AWS EC2 resources online, users are not aware of the additional impact of their actions. Depending on usage, security vulnerabilities can be attributed to several factors. First, simple misconfigurations in a user's AWS setup can leave room for security flaws. Another, subtler but more serious cause can be simple confusion: with so much going on in AWS's usage-based system, it is hard for the human eye to keep track of, and maintain visibility into, what is going on behind the scenes.
Check out the related cloud security insight: Unnecessary Security Group Ports are open on DB Server.
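A misconfiguration check of that kind can be sketched as follows: flag any security-group rule that exposes a common database port to the whole internet. The record shape loosely mirrors an EC2 DescribeSecurityGroups response, and both the port list and sample data are illustrative assumptions.

```python
# Sketch: flag security-group rules that open common database ports
# to 0.0.0.0/0. Port list and record shape are illustrative.

DB_PORTS = {3306, 5432, 1433, 27017}  # MySQL, PostgreSQL, SQL Server, MongoDB

def open_db_ports(groups):
    """Return (group_name, port) pairs open to the internet on a DB port."""
    findings = []
    for group in groups:
        for rule in group["rules"]:
            if rule["cidr"] == "0.0.0.0/0" and rule["port"] in DB_PORTS:
                findings.append((group["name"], rule["port"]))
    return findings

groups = [
    {"name": "web", "rules": [{"port": 80, "cidr": "0.0.0.0/0"}]},
    {"name": "db",  "rules": [{"port": 5432, "cidr": "0.0.0.0/0"}]},
]

print(open_db_ports(groups))  # [('db', 5432)]
```

The fix is usually to restrict the DB rule's source to the application servers' security group rather than an open CIDR.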
It is surprising to see that a significant number of AWS users are aware of the benefits of ELB, yet are not familiar with the inherent opportunity to increase availability by using ELB across multiple availability zones. Without this, should an outage or a drop in service level hit an AWS availability zone, the benefit of the ELB is nearly lost. Setting up ELB across multiple availability zones is a relatively straightforward effort that can easily be done if brought to the user's attention at the time of ELB configuration.
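Detecting this situation is simple: any load balancer attached to fewer than two availability zones is a single point of failure. The sketch below works over records loosely shaped like an ELB DescribeLoadBalancers response; the names and zones are illustrative.

```python
# Sketch: flag load balancers attached to only one availability zone.
# Record shape and sample data are illustrative assumptions.

def single_az_elbs(load_balancers):
    """Return names of ELBs that span fewer than two availability zones."""
    return [
        lb["name"] for lb in load_balancers
        if len(lb["availability_zones"]) < 2
    ]

load_balancers = [
    {"name": "web-elb", "availability_zones": ["us-east-1a"]},
    {"name": "api-elb", "availability_zones": ["us-east-1a", "us-east-1b"]},
]

print(single_az_elbs(load_balancers))  # ['web-elb']
```

Remediation is just adding a second zone to the flagged ELB and making sure backend instances run in both zones.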
In AWS, users are charged for allocated Elastic IPs that are not associated with a running instance or with a network interface (VPC). Therefore, the best practice is to keep only those IP addresses that will be needed in the future. Allocated Elastic IPs you don't plan to use, or those you simply forgot to release, may contribute to unexpectedly high bills.
Newvem tracks the usage of your allocated Elastic IPs and identifies those that haven’t been in use for a significant period. We suggest you consider releasing those allocated IP addresses if you do not plan to use them.
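The core of such a check can be sketched as follows: an allocated Elastic IP attached to neither an instance nor a network interface is a candidate for release. The record shape loosely mirrors an EC2 DescribeAddresses response; the sample addresses are illustrative.

```python
# Sketch: find allocated Elastic IPs that are attached to nothing.
# Record shape and sample data are illustrative assumptions.

def unassociated_eips(addresses):
    """Return Elastic IPs allocated but bound to no instance or interface."""
    return [
        a["public_ip"] for a in addresses
        if a.get("instance_id") is None and a.get("network_interface_id") is None
    ]

addresses = [
    {"public_ip": "54.1.2.3", "instance_id": "i-aaa111",
     "network_interface_id": None},
    {"public_ip": "54.4.5.6", "instance_id": None,
     "network_interface_id": None},
]

print(unassociated_eips(addresses))  # ['54.4.5.6']
```

A real implementation would also track how long each address has been unassociated before suggesting its release, as a brief gap during a redeployment is normal.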