Setting up an Ubuntu 14.04 LEMP Server with the Nightly Build of PHP7

Install nginx

apt-get update
apt-get install nginx

Install and secure MySQL

apt-get install mysql-server
mysql_secure_installation

Add Zend repo and install PHP7

echo "deb ubuntu/" >> /etc/apt/sources.list
apt-get update 
apt-get install php7-nightly

Configure PHP7 php-fpm.conf

cd /usr/local/php7/etc
cp php-fpm.conf.default php-fpm.conf
nano php-fpm.conf


pid = /var/run/


error_log = /var/log/php-fpm.log

Configure PHP7 www.conf

cd /usr/local/php7/etc/php-fpm.d
cp www.conf.default www.conf
nano www.conf


user = www-data


group = www-data


listen.allowed_clients =


security.limit_extensions = .php .php3 .php4 .php5 .php7
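If you'd rather script these edits than open nano, the same user/group changes can be applied with sed. A sketch, demonstrated on a throwaway file so it can be run anywhere — on the real server, point CONF at /usr/local/php7/etc/php-fpm.d/www.conf instead:

```shell
# Demonstration on a temporary file; on a real server set CONF to
# /usr/local/php7/etc/php-fpm.d/www.conf instead.
CONF=$(mktemp)
printf 'user = nobody\ngroup = nobody\n' > "$CONF"
# Rewrite the user and group directives in place.
sed -i 's/^user = .*/user = www-data/; s/^group = .*/group = www-data/' "$CONF"
cat "$CONF"
```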

Configure PHP7

wget -O /etc/init.d/php7-fpm ""
chmod a+x /etc/init.d/php7-fpm
touch /etc/init/php7-fpm.conf
nano /etc/init/php7-fpm.conf

Add the following to php7-fpm.conf

exec /usr/local/php7/sbin/php-fpm --nodaemonize --fpm-config /usr/local/php7/etc/php-fpm.conf
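On its own, the exec line isn't a complete Upstart job. A minimal sketch of what /etc/init/php7-fpm.conf could contain — only the exec line comes from this write-up; the start/stop and respawn stanzas are conventional Upstart additions:

```shell
# /etc/init/php7-fpm.conf -- minimal Upstart job (sketch)
start on runlevel [2345]
stop on runlevel [016]
respawn
exec /usr/local/php7/sbin/php-fpm --nodaemonize --fpm-config /usr/local/php7/etc/php-fpm.conf
```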

Add the checkconf file for PHP7

touch /usr/local/lib/php7-fpm-checkconf
nano /usr/local/lib/php7-fpm-checkconf

Add the following to the php7-fpm-checkconf

#!/bin/sh
set -e
errors=$(/usr/local/php7/sbin/php-fpm --fpm-config /usr/local/php7/etc/php-fpm.conf -t 2>&1 | grep "\[ERROR\]" || true)
if [ -n "$errors" ]; then
    echo "Please fix your configuration file..."
    echo "$errors"
    exit 1
fi
exit 0
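The grep that collects errors needs an `|| true` guard: grep exits non-zero when it finds no matches, which would abort the whole script under `set -e`. A standalone demonstration of the pattern, using invented sample output in place of real php-fpm -t output:

```shell
set -e
# Sample "php-fpm -t" output, invented for demonstration.
sample='NOTICE: configuration file test is successful
[ERROR] failed to open configuration file'
# Without "|| true", grep finding nothing would kill the script under set -e.
errors=$(printf '%s\n' "$sample" | grep "\[ERROR\]" || true)
if [ -n "$errors" ]; then
    echo "found: $errors"
fi
# With clean output, errors stays empty and execution continues.
clean=$(printf 'NOTICE: configuration file test is successful\n' | grep "\[ERROR\]" || true)
[ -z "$clean" ] && echo "clean output: no errors"
```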

Configure PHP7 continued

chmod a+x /usr/local/lib/php7-fpm-checkconf
update-rc.d -f php7-fpm defaults
ln -s /usr/local/php7/bin/php /usr/local/bin/php
ln -s /usr/local/php7/sbin/php-fpm /usr/sbin/php-fpm
service php7-fpm start

Set up nginx config

cd /etc/nginx/sites-available
cp default (your site here).conf
nano (your site here).conf

Adjust the config as needed (example config below)

server {
        listen *:80;
        server_name (server name here);

        root /usr/share/nginx/html;
        index index.php index.html index.htm;

        client_max_body_size 1m;

        error_log /var/log/nginx/(your site here).error.log;
        access_log /var/log/nginx/(your site here).access.log;

        location / {
                try_files $uri $uri/ /index.php$is_args$args;
        }

        location ~ \.php$ {
                fastcgi_index index.php;
                fastcgi_split_path_info ^(.+\.php)(/.*)$;
                try_files $uri $uri/ /index.php$is_args$args;
                include /etc/nginx/fastcgi_params;
                fastcgi_pass 127.0.0.1:9000; # php-fpm's default listen address
                fastcgi_param SCRIPT_FILENAME $request_filename;
                fastcgi_param APP_ENV dev;
        }
}

Continue set up nginx

ln -s /etc/nginx/sites-available/(your site here).conf /etc/nginx/sites-enabled/(your site here).conf
rm ../sites-enabled/default
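The available/enabled split is just a symlink convention. Here it is demonstrated on throwaway directories (mktemp stands in for the real /etc/nginx paths), so you can see that enabling and disabling a site never touches the config itself:

```shell
# Throwaway stand-ins for /etc/nginx/sites-available and sites-enabled.
avail=$(mktemp -d)
enabled=$(mktemp -d)
echo "server { listen 80; }" > "$avail/mysite.conf"
# Enabling a site = linking its config into the enabled directory.
ln -s "$avail/mysite.conf" "$enabled/mysite.conf"
cat "$enabled/mysite.conf"
# Disabling it again is just removing the link; the real config survives.
rm "$enabled/mysite.conf"
ls "$avail"
```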

Restart services

service php7-fpm restart
service nginx restart

ColdFusion Developer

OmniSpear, Inc. is currently seeking versatile individuals eager to join our team of talented web professionals. Working in a business-to-business environment, you will be responsible for working on large and small web-based applications for our clients.

This is a full time position at our Miamisburg, Ohio office (located near the I-75 Austin Pike Interchange). You must be a U.S. citizen or permanent resident.

Competitive compensation, outstanding health benefits, IRA matching, 14 days PTO, casual work environment, plus other perks.

Required Skills: 
· Bachelor’s Degree or equivalent work experience
· Min 2 years of experience in ColdFusion migration
· Solid understanding of object-oriented analysis, design patterns and coding best practices
· Experience with ColdFusion Web Services and Remoting
· Experience in designing and developing scalable and distributed applications (RESTful web services, JMS, integration patterns)
· Experience in Adobe ColdFusion 7-9
· Experience in MS SQL 2005 to 2008
· Experience in Web 2.0 frameworks: jQuery, Prototype

Required experience:

  • ColdFusion: 2 years

Regression testing of websites with just a little JavaScript.

In my past professions, whenever the word “testing” came up, there was usually one of two follow-up actions for me to take as a developer.

– Get with a group of testers and draft a QA (quality assurance) specification.
– Be the one stuck with the task of programming test scripts in some language, which would take quite a bit of time (I was using Borland SilkTest version 6, and I’m quite sure the latest is a dramatic improvement).

As I spend most of my time in JavaScript, it would be nice to create standard automated regression tests within the same realm as my skill set (i.e. web technologies). And that’s where PhantomJS comes in. Web developers who have used the Console window inside their browser, cranking out console.log statements, will be right at home with PhantomJS.

What is it? It’s a headless (meaning it has no visible window) web browser based on WebKit that is “scriptable”. Imagine a console terminal application that can be programmed to go to a web site and perform the following:

– Calculates the time it takes to load the page.
– Performs some validation. When the script enters a letter instead of an integer (inside an input box), does it error like it’s supposed to?
– Automated action. Fill in the input fields and click on the submit button.

If you’re comfortable using jQuery, you can simply load it in the PhantomJS environment and off you go. Here’s a taste of a PhantomJS script that visits a blog page to print out all the blog headings:

var page = require('webpage').create();
page.onConsoleMessage = function(msg) {
    console.log(msg); // relay messages from the page to the terminal
};
page.open("(your blog URL here)", function(status) {
    if (status === "success") {
        page.includeJs("(path to jquery.min.js here)", function() {
            page.evaluate(function() {
                var blogs = $('#blog_feed_text');
                $(blogs).find('.wp-title a').each(function(i) {
                    console.log($(this).text());
                });
            });
            phantom.exit();
        });
    }
});

I’ve been following the development of PhantomJS since it first came out in 2011 and I am excited about where the project is heading. If testing websites is something that is in the back of your mind, you definitely should check it out.

Full-Time Web Developer

OmniSpear, Inc. is currently seeking versatile individuals eager to join our team of talented web professionals. Working in a business-to-business environment, you will be responsible for working on large and small web-based applications for our clients. Projects range from custom websites to custom ERP and CRM systems.

This is a full time position at our Miamisburg, Ohio office (located near the I-75 Austin Pike Interchange). You must be a U.S. citizen or permanent resident.

Requirements:
– Attained or pursuing a Bachelor’s degree in information technology/computer science or equivalent experience
– 2 to 5 years of experience with PHP, C#, Java, Scala, or Ruby
– Show the capability to learn new languages and existing code bases quickly
– Experience with SQL (dialect not important)
– Experience with JavaScript and HTML/CSS
– Clear communication and comprehension skills
– Ability to manage multiple tasks simultaneously
– Experience using source control such as Git

Responsibilities:
– Develop, support, and maintain web-based applications
– Identify opportunities for application scalability, sustainability & improvement
– Evaluate customer or internally-driven functionality change requests for technical feasibility and level of effort
– Track time spent on projects effectively
– Document and write tests for code
– Follow established development standards for the company and clients

Bonus Skills:

– Experience with Continuous Integration or Automated Deployments
– Experience with Linux web server configuration
– Experience with IIS configuration

Benefits:
– Insurance
– IRA Plan with matching
– Casual work environment

Send resumes to


The Landscape after Penguin and Panda

You probably have heard that Google made major changes to its search engine algorithms in 2013. If you’re wondering what these changes mean to you, you’re not alone.

Search engines have two basic jobs. The first is evaluating and ranking websites based on the quality of their content and the authority they’ve earned from acquiring inbound links. The second is interpreting the intent of the words and phrases in the search bar and returning the most relevant results, placing the highest quality and most authoritative websites at the top. Last year’s overhaul has improved Google’s performance on both.

So what changed last year? The SEO landscape is definitely different, and many practices recommended by SEO experts in the past no longer work. We’ve read and listened to many posts, tweets and whiteboard presentations by Google insiders and by outside experts with real tools to evaluate the results of the changes, often to an overwhelming degree. Much as you would view an impressionist painting from a distance to see the artist’s rendering of its subject, a little bit of time gives perspective to the new SEO landscape created by the changes. Here’s a summary of that landscape.

Google can more effectively identify sites with high quality original content above the fold and reward them with higher page ranks. It can also apply penalties to sites that paste together content from other sources or fill the top of the page with ads and little content. Code named Panda, this part of the algorithm goes beyond scanning meta data descriptions and on-page keywords. Think of it as “seeing” visible content on the page to determine the User Experience and assigning the subject of the page based on that visible content. No more manipulating results with keyword stuffing or writing eloquent meta descriptions with little or no real content for the User to view. Google is looking for good User Experiences…the same as you or I, as users, would expect.

Google also appears on a mission to lower the authority of sites gaining prominence from mostly purchased or self-published links and keyword-stuffed content. Think of this as the blow to those that “bought their way to the top.” An update to this algorithm originating in early 2012, code named Penguin, was implemented in early 2013. Penguin dealt a real blow to pay-to-list directory sites and aggregator spam blog sites. Is Google acting in its own best interest at eliminating these competitive middlemen? Perhaps, but you can’t complain about their efforts to stop this parasitical practice.

In a tweet about Penguin updates in October, Matt Cutts, head of the webspam team at Google, refers us to a webmaster tools blog post that summarizes the SEO landscape after Penguin and Panda well:

“While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.”

It’s early 2014 and rest assured Google is working on algorithm updates. So do we sit and wait for the next round? We can’t… that is, if search results matter to the success of your business. It’s imperative to act now. The current landscape is based on solid principles: focus on the user experience and create high-quality, content-rich sites. Taking any action in either or both of these directions will only improve search engine results and more importantly improve results with Customers.

More next time on the second part of a search engine’s job: interpreting the search and returning relevant results.

Chrome’s Creative Take On LEGO

Chrome and LEGO have teamed up to create a unique and fun building tool, guaranteed to bring the kid and hidden architect out of anyone. With its clear 3D graphics and numerous options of brick sizes and colors, you can spend hours building masterpieces without having to ever tear apart and clean up as with the traditional toy.  The platform itself is very seamless and bursting with animation, allowing the user to retain clear views of their creation from whichever angle the bricks are rotated and stacked upon each other.  The possibilities of building are pretty vast and I must say that even without having ever owned a LEGO set, I found myself hooked.  Take a look at this link to start creating for yourself (please don’t blame me for hours of lost time):

The program even features a builder academy to train the LEGO newbies on the tips and techniques to building like a pro.  As you progress through the lessons, you learn the art of the LEGO, including the hidden gems of the program.  You also have the option to explore other people’s creations from around the states with a map that shows you exactly where that person is located.  Meaning, once you create your build, you can share your project with the world.   It was surprising to see so many detailed models out there, each very different from the other.  I must say my starter build paled in comparison to the pros’, but nonetheless, I had a great time testing the software.

I also learned throughout my research of this program that Google has had a longtime connection with LEGO.  Their original server storage was even housed in a LEGO unit as a cheap cabinet alternative.  This unit can still be found at Stanford University today, and now Chrome can pay homage to the beloved toy with its own online twist.

Use multiple cloud hosting companies for efficient pricing and effective infrastructure.

While the idea of managing data and applications using multiple cloud providers may sound like a maintenance nightmare, are there any benefits? One likely benefit is the removal of a single point of failure within your business process. Cloud providers ranging from Amazon to Microsoft have faced outages in the past, and it surely was noticed throughout the web (Instagram’s website, for one, was unavailable).

By putting your business processes in multiple locations and on multiple infrastructures, you will find yourself in the following situation: “Our website and applications are down, but our reporting service and HR systems are still up.” Doesn’t that sound better than “It’s all down. We’re waiting on the hosting company to come back with an ETA”? Perhaps your reporting service handling HR information is hosted in Microsoft’s Azure (SQL Server and SharePoint rigged with SQL Reporting Services). Just because your corporate website and product are unavailable doesn’t mean your whole company is at a full stop.

Selecting multiple cloud carriers leaves room for creative price management. Perhaps one provider is better for bandwidth while another is more flexible for storage. If the cost of managing your IT infrastructure is really a concern, selecting multiple cloud hosting companies should definitely be considered. And there are solutions out there to help manage the multiple cloud environment, such as RightScale.

Further reading:

2014 Best Cloud Hosting Reviews and Comparisons

Amazon Web Services suffers outage, takes down Vine, Instagram, others with it

Designing Website Asset Caching with Scalatra

Regardless of whether you spent countless hours writing pretty CSS and JavaScript or you just need to package Bootstrap and jQuery in your application, the last thing you need to worry about is loading all of your assets individually. This can wreck your site’s loading times and rendering performance. Scalatra, being a lightweight framework for building straightforward applications and APIs, doesn’t come with a prescribed solution, but WRO4J can step in to assist us with this process. We will step through the configuration of WRO4J and SBT, and finally the inclusion of these into your layouts.

To configure WRO4J we will first add it into our Build.scala or Build.sbt file. We will add a dependency in the following manner:

      libraryDependencies ++= Seq(
        "ro.isdc.wro4j" % "wro4j-core" % "1.4.0"
      )

We then need to configure our wro.xml file for our Bootstrap and jQuery assets:

<?xml version="1.0" encoding="UTF-8"?>
<groups xmlns="">
    <group name='core'>
        <css minimize="false">/css/bootstrap.min.css</css>
        <css minimize="false">/css/bootstrap-theme.min.css</css>
        <css minimize="false">/css/bootstrap-glyphicons.css</css>
        <css minimize="true">/css/app.css</css>
        <js minimize="false">/js/jquery.min.js</js>
    </group>
    <group name='bottom'>
        <js minimize="true">/js/bootstrap.min.js</js>
    </group>
</groups>

Followed by our file which defines our WRO4J usage:


Now we will adjust our layouts to point to the new compiled asset:

        <link href="/assets/core.css" rel="stylesheet" />
        <script src="/assets/core.js"></script>

While this is running you should see a local development speed improvement similar to the table below. This result will be amplified when the application is running remotely in a production environment.

To make this capability suit our needs better in development and to additionally be used for Cache Busting in production we can also add a versioning output to our SBT build process. This will create a file that holds version information and a build number from our Jenkins CI, which we will use to determine the build state and to append to the URL to bust the cache.

To create our version.scala add the following code to your Build.scala file:

lazy val project = Project (
      // put this in your build.sbt, this will create a Version.scala file that reflects current build state
      sourceGenerators in Compile <+= (sourceManaged in Compile, name, organization, version) map {
        (sourceManaged: File, name: String, vgp: String, buildVersion) =>
          import java.util.Date
          val file = sourceManaged / vgp.replace(".", "/") / "Version.scala"
          val code =
              (if (vgp != null && vgp.nonEmpty) "package " + vgp + "\n"
               else ""
              ) +
              "object Version {\n" +
              "  val name\t= \"" + name + "\"\n" +
              "  val version\t= \"" + buildVersion + "\"\n" +
              "  val datetime\t= \"" + new Date().toString() + "\"\n" +
              "  val buildNumber\t= \"" + System.getProperty("BUILD_NUMBER", "MANUAL_BUILD") + "\"\n" +
              "}\n"
          IO.write(file, code)
          Seq(file)
      }
    )

We also need to make sure that Scalate will see this class in scope during runtime.

In Build.scala replace:

            Seq.empty,  /* default imports should be added here */

with:

            Seq("import "+Organization+".Version"), /* default imports should be added here */

Now we can update our layout file with a condition that will automatically switch between development and production inclusions of our assets with a Cache Busting scheme.

        <!-- Bootstrap core CSS -->
        <link href="/css/bootstrap.min.css" rel="stylesheet" />
        <script src="//"></script>
        <link href="/assets/core.css?v=<%=com.omnispear.example.assets.Version.buildNumber%>" rel="stylesheet" />
        <script src="/assets/core.js?v=<%=com.omnispear.example.assets.Version.buildNumber%>"></script>

To verify that your BUILD_NUMBER handling works and is passed from Jenkins (or your other build system of choice), you can run the example command below. Then verify that your assets have switched properly and that you are seeing minified and gzipped responses.

./sbt -DBUILD_NUMBER=234 "container:start" "~ ;copy-resources;aux-compile"
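The System.getProperty("BUILD_NUMBER", "MANUAL_BUILD") call in the generator falls back to MANUAL_BUILD when Jenkins hasn't supplied a number. The same fallback logic, sketched in shell purely for illustration:

```shell
# Mirror of the Scala fallback: use BUILD_NUMBER if set, else MANUAL_BUILD.
unset BUILD_NUMBER
echo "local build:   ${BUILD_NUMBER:-MANUAL_BUILD}"
BUILD_NUMBER=234
echo "Jenkins build: ${BUILD_NUMBER:-MANUAL_BUILD}"
```

So a manual `./sbt` run stamps the assets with MANUAL_BUILD, while a Jenkins run stamps them with the real build number, giving every deploy a fresh cache-busting query string.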

An alternative solution would be to use the SBT WRO4J plugin. However, we feel the above method is slightly cleaner and easier to work with for our development process.

The example project for this is available on GitHub. We would also like to thank those who helped with the initial generation of the Version.scala.

Marketing And Its Relationship To Information Technology

How often does the marketing department of a firm need to understand and utilize tools of technology today?  The answer is: a lot.  In this day and age, marketers need not only to understand the customer through research and surveying, but also to home in on information in an organized way.  Interpreting data requires many technical skills that people often do not realize.  The ability to create graphs electronically, run data queries and utilize a slew of programs to assemble a perfect slide deck is vital for marketers.  In turn, the marketing goals of the business inherently shape the decisions the business makes.

You won’t find a marketing department in many of the larger firms without CRM (customer relationship management) software in place.  These programs are often heavily customized to extract the most pertinent data, collected through research of populations and customer bases, and to analyze it for trends.  A marketing professional must truly understand what information they are looking for and know how to get it – i.e. know the software inside and out.  Successful analysis requires core knowledge of a company’s current situation and department-wide goals, as well as some detailed IT-related skills.

Currently, IT departments are working more closely with their marketing arms to understand the customer, as well as their separate channels and ultimate goals. By appreciating these factors, businesses can focus on how to approach customers and the most important products to invest time and energy creating. Organizing a unified goal among departments is key for businesses to prosper, and sharing this important information requires knowledge of technology and its programs.

For a great article detailing the relationship between marketing and IT, please click on the below link:

CSS Frameworks: a Developer’s Best Friend

Most contemporary web developers recognize the importance of using an application framework when developing for the web.  From Ruby on Rails to the litany of PHP frameworks available to up-and-coming Node.js offerings, application frameworks substantially reduce development time while improving code readability and standardizing development practices for the web.  Surprisingly, though, many developers continue to handcraft their CSS for each application, or copy-and-paste legacy layout styles to new projects.  CSS frameworks improve developer workflow for a few key reasons:

1. CSS frameworks standardize layout code.

Handcrafting layout code or maintaining your own company styles is a functional solution, but it reduces readability.  New developers, or developers working on new projects, must learn the terminology and intersectional usage of a number of styles before they can step in to make changes.  Custom stylesheets that have passed through a number of hands often accrue a number of outdated, conflicting, or repetitive styles that further worsen this problem.  Regular refactoring can solve this problem to an extent, but who wants to waste time rewriting old CSS?

CSS frameworks standardize layout code because each class in the framework has the same meaning across applications, in much the same way that each element of an application framework connotes the same functionality across projects.  This keeps stylesheets short and readable: each developer can learn the framework styles once, and only have to use custom styles for application-specific colors and layout elements.

2. CSS frameworks solve cross-browser compatibility issues.

A mature CSS framework, such as Bootstrap or Foundation, will have undergone an extended period of open-source development.  This means that layout issues with old browsers have often been sussed out already, eliminating most of that feeling of shock you get when you first open your site in IE7.  Cross-browser testing is a must for any project, but a good CSS framework will solve many problems for you before you even notice them.

3. CSS frameworks play well with mobile devices.

CSS frameworks are generally built around responsive principles, automatically optimizing content and even menus for the mobile web.  While application-specific tweaks will need to be made, frameworks drastically reduce the time necessary to create a great user experience on mobile platforms.  Additionally, some frameworks, such as Foundation, include layout elements that allow you to control the flow and order of layout elements as they are resized downwards.

Which framework?

The two most common CSS frameworks being used today are Twitter’s Bootstrap and Zurb’s Foundation.  Both offer the advantages mentioned above and neither is a poor choice; however, they have different focal areas that make each one a better choice for different types of projects.  Bootstrap has traditionally been focused more around desktop than mobile, although this has changed to an extent with its most recent release; if battle-tested responsive behavior is important to your project, Foundation is the better choice.  Additionally, a number of developers tend to adhere too closely to stock Bootstrap in their implementations, creating a “Bootstrappy feel” that plagues a number of websites.  Foundation doesn’t tend to create this effect, but that may be just because it’s currently the less popular of the two.  Finally, if your application is in Ruby on Rails, Foundation is the far superior choice; it is implemented with SCSS classes that allow you to create mixins and adjust variables for easy access to powerful customizability.   Whichever option you select, though, use of a CSS framework will go a long way towards building a powerful, beautiful application with readable, well-maintained styles.