The Landscape after Penguin and Panda

You probably have heard that Google made major changes to its search engine algorithms in 2013. If you’re wondering what these changes mean to you, you’re not alone.

Search engines have two basic jobs. The first is evaluating and ranking websites based on the quality of their content and the authority they’ve earned from acquiring inbound links. The second is interpreting the intent of the words and phrases in the search bar and returning the most relevant results, placing the highest quality and most authoritative websites at the top. Last year’s overhaul has improved Google’s performance on both.

So what changed last year? The SEO landscape is definitely different, and many practices recommended by SEO experts in the past no longer work. We’ve read and listened to many posts, tweets and whiteboard presentations by Google insiders and by outside experts with real tools for evaluating the results of the changes, often to an overwhelming degree. Much as you would view an impressionist painting from a distance to see the artist’s rendering of its subject, a little time gives perspective on the new SEO landscape these changes created. Here’s a summary of that landscape.

Google can more effectively identify sites with high-quality original content above the fold and reward them with higher page ranks. It can also penalize sites that paste together content from other sources or fill the top of the page with ads and little content. Code named Panda, this part of the algorithm goes beyond scanning metadata descriptions and on-page keywords. Think of it as “seeing” the visible content on the page to determine the User Experience and assigning the subject of the page based on that visible content. No more manipulating results with keyword stuffing or with eloquent meta descriptions that sit atop little or no real content for the User to view. Google is looking for good User Experiences…the same as you or I would expect as users.

Google also appears to be on a mission to lower the authority of sites that gained prominence from mostly purchased or self-published links and keyword-stuffed content. Think of this as the blow to those who “bought their way to the top.” This part of the algorithm, code named Penguin, originated in early 2012, and an update was implemented in early 2013. Penguin dealt a real blow to pay-to-list directory sites and aggregator spam blogs. Is Google acting in its own best interest by eliminating these competitive middlemen? Perhaps, but you can’t complain about its efforts to stop this parasitic practice.

In a tweet about Penguin updates in October, Matt Cutts, head of the webspam team at Google, refers us to a webmaster tools blog post that summarizes the SEO landscape after Penguin and Panda well:

“While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.”

It’s early 2014 and, rest assured, Google is working on further algorithm updates. So do we sit and wait for the next round? We can’t… that is, if search results matter to the success of your business. It’s imperative to act now. The current landscape is based on solid principles: focus on the user experience and create high-quality, content-rich sites. Taking action in either or both of these directions will only improve search engine results and, more importantly, improve results with Customers.

More next time on the second part of a search engine’s job: interpreting the search and returning relevant results.

Chrome’s Creative Take On LEGO

Chrome and LEGO have teamed up to create a unique and fun building tool, guaranteed to bring out the kid and hidden architect in anyone. With its clear 3D graphics and numerous options for brick sizes and colors, you can spend hours building masterpieces without ever having to tear apart and clean up as with the traditional toy.  The platform itself is very seamless and bursting with animation, allowing you to keep a clear view of your creation from whichever angle the bricks are rotated and stacked.  The building possibilities are pretty vast, and I must say that even without ever having owned a LEGO set, I found myself hooked.  Take a look at this link to start creating for yourself (please don’t blame me for hours of lost time):

http://www.buildwithchrome.com/

The program even features a builder academy to train LEGO newbies on the tips and techniques for building like a pro.  As you progress through the lessons, you learn the art of the LEGO, including the hidden gems of the program.  You also have the option to explore other people’s creations from around the States with a map that shows you exactly where each builder is located.  Meaning, once you create your build, you can share your project with the world.   It was surprising to see so many detailed models out there, each very different from the other.  I must say my starter build paled in comparison to the pros’, but nonetheless I had a great time testing the software.

While researching this program, I also learned that Google has a longtime connection with LEGO.  The company’s original server storage was even housed in a LEGO unit as a cheap cabinet alternative.  This unit can still be found at Stanford University today, and now Chrome pays homage to the beloved toy with its own online twist.

Use multiple cloud hosting companies for efficient pricing and effective infrastructure.

The idea of managing data and applications across multiple cloud providers may sound like a maintenance nightmare, but are there any benefits? One likely benefit is removing a single point of failure from your business processes. Cloud providers from Amazon to Microsoft have faced outages in the past, and the effects were surely noticed across the web (Instagram’s website was unavailable).

By putting your business processes in multiple locations and on multiple infrastructures, you will find yourself in the following situation: “Our website and applications are down, but our reporting service and HR systems are still up.” Doesn’t that sound better than “It’s all down. We’re waiting on the hosting company to come back with an ETA”? Perhaps your reporting service is hosted in Microsoft’s Azure (SQL Server and SharePoint rigged with SQL Reporting Services) with Salesforce.com handling HR information. Just because your corporate website and product are unavailable doesn’t mean your whole company is at a full stop.

Selecting multiple cloud carriers also leaves room for creative price management. Perhaps one provider is better for bandwidth while another is more flexible for storage. If the cost of managing your IT infrastructure is a real concern, selecting multiple cloud hosting companies should definitely be considered. And there are solutions out there to help manage a multi-cloud environment, such as RightScale.

Further reading:

Amazon Web Services suffers outage, takes down Vine, Instagram, others with it

Designing Website Asset Caching with Scalatra

Whether you spent countless hours writing pretty CSS and JavaScript or you just need to package Bootstrap and jQuery in your application, the last thing you want to worry about is loading all of your assets individually. Doing so can wreck your site’s loading times and rendering performance. Scalatra, being a lightweight framework for building straightforward applications and APIs, doesn’t come with a prescribed solution, but WRO4J can step in to assist with this process. We will step through the configuration of WRO4J and SBT, and finally the inclusion of the compiled assets in your layouts.

To configure WRO4J, we first add it to our Build.scala or build.sbt file as a dependency in the following manner:

      libraryDependencies ++= Seq(
        "ro.isdc.wro4j" % "wro4j-core" % "1.4.0",
        ...
      )

We then need to configure our wro.xml file for our Bootstrap and jQuery assets:

<?xml version="1.0" encoding="UTF-8"?>
<groups xmlns="http://www.isdc.ro/wro">
    <group name='core'>
        <css minimize="false">/css/bootstrap.min.css</css>
        <css minimize="false">/css/bootstrap-theme.min.css</css>
        <css minimize="false">/css/bootstrap-glyphicons.css</css>
        <css minimize="true">/css/app.css</css>
        <js minimize="false">/js/jquery.min.js</js>
    </group>
    <group name='bottom'>
        <js minimize="true">/js/bootstrap.min.js</js>
    </group>
</groups>

Followed by our wro.properties file which defines our WRO4J usage:

cacheUpdatePeriod=0
modelUpdatePeriod=0
debug=false
disableCache=false
gzipResources=true
ignoreMissingResources=false
jmxEnabled=true
preProcessors=cssImport,semicolonAppender
postProcessors=cssMinJawr,jsMin
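
WRO4J also needs to be wired into the servlet container so that requests under /assets/* actually reach it. One way to do that is a programmatic registration along the lines of the sketch below; this assumes the stock ro.isdc.wro.http.WroFilter from wro4j-core, a standard ScalatraBootstrap, and that wro.xml and wro.properties are placed where WRO4J expects them (typically under WEB-INF). A web.xml filter entry is an equivalent option.

    // ScalatraBootstrap.scala: a minimal sketch, not part of the original setup
    import java.util.EnumSet
    import javax.servlet.{DispatcherType, ServletContext}
    import org.scalatra.LifeCycle
    import ro.isdc.wro.http.WroFilter

    class ScalatraBootstrap extends LifeCycle {
      override def init(context: ServletContext) {
        // Serve the groups defined in wro.xml (core.css, core.js, ...) under /assets/*
        val wro = context.addFilter("wroFilter", classOf[WroFilter])
        wro.addMappingForUrlPatterns(EnumSet.of(DispatcherType.REQUEST), false, "/assets/*")

        // ... mount your Scalatra servlets here as usual ...
      }
    }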

Now we will adjust our layouts to point to the new compiled asset:

        <link href="/assets/core.css" rel="stylesheet" />
        <script src="/assets/core.js"></script>

While this is running you should see a noticeable improvement in local development load times. The result will be amplified when the application is running remotely in a production environment.

To make this setup better suited to development, and to enable Cache Busting in production, we can also add version output to our SBT build process. This will create a file that holds version information and a build number from our Jenkins CI, which we use both to determine the build state and to append to asset URLs to bust the cache.

To create our Version.scala, add the following code to your Build.scala file:

lazy val project = Project (
    ...
      // put this in your build.sbt; this will create a Version.scala file that reflects the current build state
      sourceGenerators in Compile <+= (sourceManaged in Compile, name, organization, version) map {
        (sourceManaged: File, name: String, vgp: String, buildVersion) =>
          import java.util.Date
          val file = sourceManaged / vgp.replace(".", "/") / "Version.scala"
          val code =
            (
              if (vgp != null && vgp.nonEmpty) "package " + vgp + "\n"
              else ""
            ) +
              "object Version {\n" +
              "  val name\t= \"" + name + "\"\n" +
              "  val version\t= \"" + buildVersion + "\"\n" +
              "  val datetime\t= \"" + new Date().toString() + "\"\n" +
              "  val buildNumber\t= \"" + System.getProperty("BUILD_NUMBER", "MANUAL_BUILD") + "\"\n" +
              "}\n"
          IO.write(file, code)
          Seq(file)
      }
    )
)
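
For reference, running the build with -DBUILD_NUMBER=234 set would produce a generated file roughly like the sketch below; the name, version, and timestamp values here are placeholders rather than output from the real example project.

    package com.omnispear.example.assets
    object Version {
      val name        = "scalatra-wro4j-example"
      val version     = "0.1.0-SNAPSHOT"
      val datetime    = "Mon Feb 10 10:15:00 EST 2014"
      val buildNumber = "234"
    }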

We also need to make sure that Scalate will see this class in scope at runtime.

In Build.scala replace:

            Seq.empty,  /* default imports should be added here */

With:

            Seq("import "+Organization+".Version"), /* default imports should be added here */

Now we can update our layout file with a condition that automatically switches between the development and production includes of our assets, using a Cache Busting scheme in production.

    #if(com.omnispear.example.assets.Version.buildNumber=="MANUAL_BUILD")
        <!-- Bootstrap core CSS -->
        <link href="/css/bootstrap.min.css" rel="stylesheet" />
        <script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
    #else
        <link href="/assets/core.css?v=<%=com.omnispear.example.assets.Version.buildNumber%>" rel="stylesheet" />
        <script src="/assets/core.js?v=<%=com.omnispear.example.assets.Version.buildNumber%>"></script>
    #end

To verify that your BUILD_NUMBER handling works and is passed through from Jenkins (or your build system of choice), you can run the example command below. Then confirm that your assets have switched properly and that you are seeing minified and gzipped responses.

./sbt -DBUILD_NUMBER=234 "container:start" "~ ;copy-resources;aux-compile"

An alternative solution would be to use the SBT WRO4J plugin. However, we feel the above method is slightly cleaner and easier to work with for our development process.

The example project for this is available on GitHub. We would also like to thank those who helped with the initial generation of the Version.scala file.

Marketing And Its Relationship To Information Technology

How often does the marketing department of a firm need to understand and utilize tools of technology today?  The answer is: a lot.  In this day and age, marketers need not only to understand the customer through research and surveying, but also to home in on information in an organized way.  Interpreting data requires more technical skill than many people realize. The ability to create graphs electronically, run data queries and use a slew of programs to assemble a polished slide deck is vital for marketers.  In turn, the marketing goals of the business inherently shape its decision making.

You won’t find a marketing department in many larger firms without CRM (customer relationship management) software in place.  These programs are often heavily customized to extract the most pertinent data collected from research on populations and customer bases, which is then analyzed for trends.  A marketing professional must truly understand what information they are looking for and know how to get it, i.e. know the software inside and out.  Successful analysis requires core knowledge of a company’s current situation and department-wide goals, as well as some detailed IT-related skills.

Currently, IT departments are working more closely with their marketing arms to understand the customer, as well as their separate channels and ultimate goals. By appreciating these factors, businesses can focus on how to approach customers and the most important products to invest time and energy creating. Organizing a unified goal among departments is key for businesses to prosper, and sharing this important information requires knowledge of technology and its programs.

For a great article detailing the relationship between marketing and IT, please click on the below link:

http://www.computerweekly.com/feature/IT-and-marketing-working-together-for-business-success

CSS Frameworks: a Developer’s Best Friend

Most contemporary web developers recognize the importance of using an application framework when developing for the web.  From Ruby on Rails to the litany of PHP frameworks available to up-and-coming Node.js offerings, application frameworks substantially reduce development time while improving code readability and standardizing development practices for the web.  Surprisingly, though, many developers continue to handcraft their CSS for each application, or copy-and-paste legacy layout styles to new projects.  CSS frameworks improve developer workflow for a few key reasons:

1. CSS frameworks standardize layout code.

Handcrafting layout code or maintaining your own company styles is a functional solution, but it reduces readability.  New developers, or developers working on new projects, must learn the terminology and the ways a number of styles interact before they can step in to make changes.  Custom stylesheets that have passed through many hands often accrue outdated, conflicting, or repetitive styles that worsen this problem further.  Regular refactoring can solve this problem to an extent, but who wants to waste time rewriting old CSS?

CSS frameworks standardize layout code because each class in the framework has the same meaning across applications, in much the same way that each element of an application framework connotes the same functionality across projects.  This keeps stylesheets short and readable: each developer can learn the framework styles once, and only have to use custom styles for application-specific colors and layout elements.

2. CSS frameworks solve cross-browser compatibility issues.

A mature CSS framework, such as Bootstrap or Foundation, will have undergone an extended period of open-source development.  This means that layout issues with old browsers have often been sussed out already, eliminating most of that feeling of shock you get when you first open your site in IE7.  Cross-browser testing is a must for any project, but a good CSS framework will solve many problems for you before you even notice them.

3. CSS frameworks play well with mobile devices.

CSS frameworks are generally built around responsive principles, automatically optimizing content and even menus for the mobile web.  While application-specific tweaks will need to be made, frameworks drastically reduce the time necessary to create a great user experience on mobile platforms.  Additionally, some frameworks, such as Foundation, include layout elements that allow you to control the flow and order of layout elements as they are resized downwards.

Which framework?

The two most common CSS frameworks being used today are Twitter’s Bootstrap and Zurb’s Foundation.  Both offer the advantages mentioned above and neither is a poor choice; however, they have different focal areas that make each one a better choice for different types of projects.  Bootstrap has traditionally been focused more around desktop than mobile, although this has changed to an extent with its most recent release; if battle-tested responsive behavior is important to your project, Foundation is the better choice.  Additionally, a number of developers tend to adhere too closely to stock Bootstrap in their implementations, creating a “Bootstrappy feel” that plagues a number of websites.  Foundation doesn’t tend to create this effect, but that may be just because it’s currently the less popular of the two.  Finally, if your application is in Ruby on Rails, Foundation is the far superior choice; it is implemented in SCSS, which lets you create mixins and adjust variables for powerful customization.   Whichever option you select, though, use of a CSS framework will go a long way towards building a powerful, beautiful application with readable, well-maintained styles.

Standard Development Methodology Improves the Odds: Meeting Clients’ Expectations and Delivering Great Custom Web Applications

Delivering great software applications is imperative to the success of a custom software development shop. Clients turn to a custom shop when off-the-shelf, prepackaged one-size-fits-all solutions can’t meet their business need. They require a unique solution. Delivering a great custom solution means merging the benefits of a standard solution – value, predictability, tried and tested – into a custom software application tailored to the Client’s own business requirements.

A great development team can more successfully deliver great software applications – applications that delight the Client, are delivered on time and on budget and are defect free – with the benefit of our Standard Development Methodology. Even though no two custom software projects will ever have the same spec, the building blocks are generally very similar. Whether you’re building an elegant glass atrium or a solid cinder block warehouse, you need to know the building’s use, put in a foundation, build a solid infrastructure and add exterior finishing touches to realize the architect’s vision and ultimately meet the client’s expectation.

Adhering to a Standard Development Methodology enables our team to more successfully engage the Client resulting in an agreed upon project scope, an effective build plan, realistic due dates and aligned expectations. Our Development Methodology allocates appropriate time to Discovery – gaining insight about the Client and its requirements – and Design – establishing a clear vision for the application. These upfront, pre-coding phases result in the most value for the development dollar.

development process

The truth is, it’s expensive to code. The greatest value in custom development is achieved when code is written to meet the project requirements the first time through. Many might view planning and documenting as time wasted. We believe time spent planning avoids wasted time during Development.

Can Data Mining Be Used For Disease Tracking?

Data mining has become a topic of heavy debate over the years, as many find it an intrusion into their personal habits and information.  While arguments are made on either side of the debate, an interesting new spin on the use of data mining has come to the table.  Epidemiologist Caroline Buckee of the Harvard School of Public Health in Boston has used it to track the spread of malaria across regions near Lake Victoria in Africa.  From her research on cell tower data, Buckee has been able to interpret cell phone usage information to follow the spread of the life-threatening disease.

How was she able to do this simply by studying these cell data figures?  The first step was tracking the largest cell tower in the region to study the travel patterns of people to and from areas near Lake Victoria.  What Buckee discovered is that people making calls or sending texts from the main tower were traveling 16 times more than others in the region.  Additionally, this same group of individuals was three times as likely to travel to Lake Victoria.   With higher exposure to the lake and the surrounding tea plantations, these travelers were prone to bites from nearby mosquitos and inadvertently spread the disease at an increased rate.

In hopes of alleviating future epidemics, Buckee plans not only to map out additional affected zones with this knowledge, but also to help prevent the spread by informing the population about nearby danger zones.  The goal is to use data extracted from cell towers to send preventive messages to travelers in compromised areas via their devices. These alerts can tell people which areas to avoid and which safety measures help contain the disease, such as where mosquito netting is a necessity.   We are curious to see what other ways technology can be used in fields, and even countries, we hadn’t thought about before.  For a topic that angers many consumers, this is a new spin that is definitely going to change the field.

http://www.technologyreview.com/featuredstory/513721/big-data-from-cheap-phones/

Recovering from CryptoLocker?

CryptoLocker is a form of ransomware that targets users through phishing emails, typically disguised as messages from UPS, FedEx, or Xerox. Once infected, many users don’t see any signs of infection until CryptoLocker has encrypted a percentage of the documents on their computer and network-attached devices.

How can you prevent CryptoLocker?

The best solutions are the tried-and-true methods: good, up-to-date antivirus software and spam/virus checking on your email. As a user, you should also be wary of files sent by anyone you don’t have regular contact with. When in doubt, don’t open them.

Recovering once infected

The only real choice for recovering from a CryptoLocker infection is to restore files from a backup source or service. Finding which files need to be recovered can be a daunting task, and OmniSpear, Inc. has an answer to assist with that process. OmniSpear has developed an application that tries to match many common file types against their expected file contents. This makes it easier, though not error-free, to determine which files and directories should be restored from your backup solution. The scanning tool is freely available on their website.
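
To illustrate the idea behind such a scan (a minimal sketch of the general approach, not OmniSpear’s actual tool): most common file formats begin with a well-known signature, so a file whose name ends in .pdf but whose first bytes are no longer “%PDF” has very likely been encrypted.

    import java.nio.file.{Files, Paths}

    object SignatureCheck {
      // Well-known "magic number" prefixes for a few common file types.
      val signatures: Map[String, Array[Byte]] = Map(
        "pdf" -> "%PDF".getBytes("US-ASCII"),
        "zip" -> Array(0x50, 0x4B, 0x03, 0x04).map(_.toByte), // also docx/xlsx/pptx
        "png" -> Array(0x89, 0x50, 0x4E, 0x47).map(_.toByte),
        "jpg" -> Array(0xFF, 0xD8, 0xFF).map(_.toByte)
      )

      // True when the file's leading bytes match the signature expected for its
      // extension; false suggests the contents may have been encrypted.
      def looksIntact(path: String): Boolean = {
        val ext = path.toLowerCase.split('.').last
        signatures.get(ext) match {
          case None => true // unknown type, nothing to compare against
          case Some(sig) =>
            val bytes = Files.readAllBytes(Paths.get(path)) // whole file, fine for a sketch
            bytes.length >= sig.length && bytes.take(sig.length).sameElements(sig)
        }
      }

      def main(args: Array[String]): Unit =
        args.foreach(p => println(s"$p intact=${looksIntact(p)}"))
    }

Run over a directory listing, a check like this flags the paths whose headers no longer match their extensions, which become the candidates to restore from backup.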

UI Designer / Developer

Description:

OmniSpear, Inc. is currently seeking creative individuals eager to join our team of talented web professionals. Working in a business-to-business environment, you will be responsible for creatively transforming internal and external concepts into workable designs.

To be considered you must submit samples of your work with your resume. This is a full-time position at our Miamisburg, Ohio office (located near the I-75 Austin Pike Interchange). You must be a U.S. citizen or permanent resident. Relocation will not be included.

Send all inquiries to careers AT omnispear.com

Requirements:

  • 2 to 5 years of experience designing for web-based platforms, interactive design, or similar
  • Associate’s degree or higher in interactive design, multimedia design, or a related field, with applicable experience
  • Excellent visual design skills, including an eye for typography, icon design, composition, and use of color
  • Practical experience in development of HTML/CSS, JavaScript, graphics creation with Adobe Creative Suite
  • Experience working in an agile environment
  • Solid understanding of effective UI patterns and best practices
  • Knowledge of design research and usability testing methods
  • Excellent written and verbal communication skills

Responsibilities:

  • 50% Designing websites and web based applications
  • 50% Implementing designs in cooperation with developers
  • Work closely with sales, engineering, and various stakeholders to design intuitive and functional user experiences
  • Translate concepts and ideas into workable designs
  • Develop and maintain mockups, visual assets, wireframes, prototypes

Bonus:

  • Development experience in PHP, Java, Ruby, etc
  • Experience using source control such as Git

Perks:

  • Insurance
  • IRA Plan with matching
  • Unlimited beverages
  • Casual work environment

Company Description:

Founded in 2001, OmniSpear, Inc. services clients in Dayton, Cincinnati, and beyond. We focus on helping our clients improve their processes, infrastructure, and appearance through our extensive knowledge of technology. Joining OmniSpear means you will get to work in an open and agile environment that supports creativity and new ideas.