System Deployment

Linux Certification System

Six Feet Up contributed to the implementation of the client's Certification Program by developing a mechanism to quickly build and tear down servers. The team integrated the configuration management tool Salt with Amazon Web Services, including EC2 clusters, S3 storage, SQS and CloudFormation. Scripts automated the management of VM environments and the creation and grading of exams. Leveraging the cross-platform tool Salt proved to be a key success factor: it helped the team work through each environment's bugs, quirks and edge cases while getting exams to operate identically in every distribution. The Six Feet Up QA team also played a key role in the project by relentlessly running each scenario against the three Linux distributions: CentOS, openSUSE and Ubuntu.
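The build and grading scripts themselves aren't published. As a rough sketch of how CloudFormation and a Salt master can be driven together from Python, assuming the boto3 and Salt client libraries (the stack name, the "exam" Salt state and the grading script path are hypothetical placeholders):

    import boto3
    import salt.client

    cloudformation = boto3.client("cloudformation", region_name="us-east-1")

    def build_exam_environment(stack_name, template_body):
        """Stand up a fresh exam environment from a CloudFormation template."""
        cloudformation.create_stack(StackName=stack_name, TemplateBody=template_body)
        # Block until the EC2 instances and supporting resources exist.
        cloudformation.get_waiter("stack_create_complete").wait(StackName=stack_name)

    def configure_and_grade(target_minions):
        """Apply a hypothetical 'exam' Salt state, then run a grading script on each minion."""
        local = salt.client.LocalClient()
        local.cmd(target_minions, "state.apply", ["exam"])
        return local.cmd(target_minions, "cmd.run", ["/usr/local/bin/grade_exam.sh"])

    def tear_down(stack_name):
        """Tear the environment back down once the exam session is over."""
        cloudformation.delete_stack(StackName=stack_name)

Because the same Salt states are applied everywhere, the CentOS, openSUSE and Ubuntu images end up configured identically before grading starts.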

See Full Success Story

Online Learning at Penn State

The College of the Liberal Arts at Penn State was running its online courses on an underpowered VM infrastructure and an inefficient web software architecture. This resulted in multiple performance issues and outages at peak traffic times.

In addition to application improvements, Six Feet Up carried out a technical investigation aimed at identifying the most appropriate hosting setup for the College. The team proposed a new hosting architecture, specified the VM sizing (RAM, CPU cores, etc.) and recommended which filesystems to use. Finally, Six Feet Up deployed and tested the College's new hosting infrastructure.

The courses now run on 9 Virtual Machines: 1 for testing, 4 for acceptance and 4 for production. In each environment, the Plone CMS runs on a pair of application servers and PostgreSQL runs on a pair of database servers. For better concurrency and scalability, RelStorage stores the data on the two PostgreSQL servers. A NetScaler load balancer is used both internally and externally to provide scaling and high availability, and also acts as a caching proxy. Memcached caches sessions and frequently used RelStorage data in the application. Six Feet Up used Pacemaker and Corosync to ensure high availability at the database layer, and leveraged Locust.io to run the final performance tests.
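Locust.io test scenarios are written as plain Python. A minimal sketch of the kind of load test used for a final performance run, using the current Locust API with illustrative URLs (the real course paths and user counts are not part of this write-up):

    from locust import HttpUser, task, between

    class CourseUser(HttpUser):
        """Simulates a student browsing course content on the Plone site."""
        wait_time = between(1, 5)  # seconds of think time between requests

        @task(3)
        def view_course_page(self):
            # Hypothetical course URL; a real test targets actual course paths.
            self.client.get("/courses/sample-course")

        @task(1)
        def view_home(self):
            self.client.get("/")

Such a file is run with locust -f locustfile.py --host=https://courses.example.edu, ramping up simulated students until the infrastructure's limits are found.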

Since the new system went into effect, the server issues have been entirely resolved and the College of the Liberal Arts has enjoyed a fast and "rock solid" system.

See Full Success Story

University of Virginia Health System

Six Feet Up put a failover system in place to prevent downtime in the case of a server outage. Site data is replicated using RelStorage, a relational database backend for the ZODB, which makes it possible to use MySQL's built-in replication services to replicate the data across two servers.
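RelStorage is normally wired in through configuration rather than code. A minimal sketch of opening a ZODB database backed by RelStorage on MySQL from Python, using a ZConfig string (the host, database name and credentials are placeholders, and option names can vary between RelStorage versions):

    from ZODB.config import databaseFromString

    config = """
    %import relstorage
    <zodb>
      <relstorage>
        <mysql>
          host db-master.example.org
          db plone
          user zodb
          passwd secret
        </mysql>
      </relstorage>
    </zodb>
    """

    db = databaseFromString(config)   # ZODB database object backed by MySQL
    connection = db.open()
    root = connection.root()          # the site's objects live under this root

With the object data stored in MySQL, the database's native master/slave replication keeps the second server's copy current.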

Search data is replicated using Solr's built-in replication services.

The application servers use the ifstated tool to verify that the master and slave services are running and that all services are replicating across both servers. If one server stops responding, ifstated allows the second server to take over and become the master.
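ifstated itself is configured in its own small rule language on the hosts, so the snippet below is only a Python illustration of the kind of checks involved: is the peer reachable, and are MySQL's replication threads running (hostnames and credentials are placeholders, assuming the pymysql package):

    import socket
    import pymysql

    def peer_is_up(host, port=80, timeout=3):
        """Return True if the peer server still answers on the given port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def replication_is_healthy(host, user, password):
        """Return True if the MySQL replica reports both replication threads running."""
        conn = pymysql.connect(host=host, user=user, password=password,
                               cursorclass=pymysql.cursors.DictCursor)
        try:
            with conn.cursor() as cur:
                cur.execute("SHOW SLAVE STATUS")
                status = cur.fetchone()
            return bool(status
                        and status["Slave_IO_Running"] == "Yes"
                        and status["Slave_SQL_Running"] == "Yes")
        finally:
            conn.close()

    if __name__ == "__main__":
        if not peer_is_up("app-master.example.org"):
            print("peer down: promote this node to master")
        if not replication_is_healthy("127.0.0.1", "monitor", "secret"):
            print("replication broken: investigate before a failover is needed")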

See Full Success Story

Fortune 200 Pharmaceutical Company

Operating in a highly regulated environment, this major pharmaceutical group needed a hosting system that offered superior security without sacrificing power or extensibility.

Six Feet Up provided high-availability hosting for this project for over 4 years, with an uptime of 99.97% or higher and no single point of failure. Six Feet Up designed the final hosting architecture, which consisted of 19 pieces of hardware, and configured the environment for failover with redundant Internet connections, private cloud storage of replicated data and full disaster recovery services.

The new hosting infrastructure ran in 3 layers:
1. A web front-end layer running Nginx, Varnish and HAProxy
2. An application layer running the open source CMS Plone
3. A database layer using PostgreSQL, with large binary objects stored on ZFS

Six Feet Up set up HAProxy as the load balancer and Varnish for caching on the front end.

The system ran out of two geographically separated colocation facilities, and ZFS was used for data replication to the disaster recovery environment. FreeBSD’s CARP ensured failover for both the web front-end and the database layers. Six Feet Up managed its own BGP and routing to handle up to 4 network service providers.
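The replication step itself boils down to zfs send piped to zfs recv on the disaster recovery side. A rough sketch of one incremental replication run driven from Python (the dataset, snapshot names and DR hostname are placeholders):

    import subprocess
    from datetime import datetime, timezone

    DATASET = "tank/plone-blobs"     # placeholder dataset holding the binary objects
    DR_HOST = "dr.example.org"       # placeholder disaster-recovery host

    def snapshot(name):
        """Create a named ZFS snapshot of the dataset."""
        subprocess.run(["zfs", "snapshot", f"{DATASET}@{name}"], check=True)

    def replicate(prev_snap, new_snap):
        """Send the incremental delta between two snapshots to the DR host."""
        send = subprocess.Popen(
            ["zfs", "send", "-i", f"{DATASET}@{prev_snap}", f"{DATASET}@{new_snap}"],
            stdout=subprocess.PIPE,
        )
        subprocess.run(["ssh", DR_HOST, "zfs", "recv", "-F", DATASET],
                       stdin=send.stdout, check=True)
        send.stdout.close()
        send.wait()

    if __name__ == "__main__":
        stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M")
        snapshot(stamp)
        replicate("last-known-good", stamp)   # "last-known-good" is a placeholder snapshot

A job like this would typically run on a schedule so the DR facility always holds a near-current copy of the data.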

Finally, the team leveraged PGP encryption to manage the verification of thousands of physician records, and encrypted backups to ensure the personal details were safe at rest.
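On the backup side, a tool like the python-gnupg package can wrap GnuPG from Python. A minimal sketch of encrypting a backup archive to a PGP key before it is stored (the recipient key, keyring path and file paths are placeholders):

    import gnupg

    gpg = gnupg.GPG(gnupghome="/var/lib/backup-keys")   # placeholder keyring location
    RECIPIENT = "backups@example.org"                   # placeholder key ID

    def encrypt_backup(path):
        """Encrypt a backup archive so the physician data is unreadable at rest."""
        with open(path, "rb") as fh:
            result = gpg.encrypt_file(fh, recipients=[RECIPIENT], output=path + ".gpg")
        if not result.ok:
            raise RuntimeError("encryption failed: " + result.status)
        return path + ".gpg"

Only holders of the matching private key can read the archives, which is what keeps the records safe if backup media is ever lost or copied.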

See Full Success Story
