Monday, August 27, 2012

AppSec USA – Register Now!



What:       AppSec USA 2012
Where:     Austin, TX
When:      October 23-26, 2012
Website:   www.appsecusa.org

AppSec conferences bring together industry, government, security researchers, and practitioners to discuss the state of the art in application security. The conference features talks and sessions across the application security space, including development, web application security, cloud security, DevOps, open source, and OWASP tools, given by top speakers in the industry.


AppSec USA 2012 Training:
One-Day Trainings: Tuesday, October 23
One-Day Trainings: Wednesday, October 24
Two-Day Trainings: Tuesday-Wednesday (October 23-24)


AppSec USA 2012 Keynote Speakers:
  • Douglas Crockford (JavaScript Developer and Inventor of JSON)
  • Michael Howard (Sr. Security Program Manager at Microsoft)
  • Gene Kim (Researcher and Founder of Tripwire)


AppSec USA 2012 Speakers:
  • HD Moore
  • Richard Bejtlich
  • Michael Coates
  • Josh Corman
  • Brendan Eich
  • Chris Evans
  • Jeremiah Grossman
  • Phillip Hallam-Baker
  • David Kennedy
  • Bob Lord
  • Adam Mein
  • Chris Nickerson
  • Alex Rice
  • Alex Russell
  • Amichai Shulman


Please visit www.appsecusa.org to learn more or register for the event. 

Also, AppSec USA has a special rate of $189/night at the conference hotel.  Reservations at this rate end on September 25th, and the Hyatt will not hold rooms for OWASP after that date, so if you reserve too late you will not be able to stay at the conference hotel.  Don’t let that happen to you: reserve your room now.


We look forward to seeing you in Austin this October!

Best Regards,
The AppSec USA 2012 Planning Team

Sunday, August 26, 2012

Santoku Linux Mobile Forensic & Security Distribution

https://santoku-linux.com/ 



Santoku is a platform for mobile forensics, mobile malware analysis, and mobile application security assessment. The free Santoku Community Edition is a collaborative project to provide a pre-configured Linux environment with utilities, drivers, and guides for these areas. The alpha release is based on a fork of the OWASP MobiSec distro.



The word santoku loosely translates as ‘three virtues’ or ‘three uses’. Santoku Linux has been crafted to support you in three endeavours: mobile forensics, mobile malware analysis, and mobile application security assessment.

Santoku Community Edition is a pre-configured, bootable Linux environment. It can be run in VirtualBox or VMware Player, both of which are available free and run on Linux, Mac, or Windows. The download is large (3+ GB) because it is a full .iso containing a variety of packages, drivers, and applications. We strongly recommend you download it on a fast connection with plenty of time (e.g. overnight).

Mobile Forensics

Tools to acquire and analyze data
  • Firmware flashing tools for multiple manufacturers
  • Imaging tools for NAND, media cards, and RAM
  • Free versions of some commercial forensics tools
  • Useful scripts and utilities specifically designed for mobile forensics

Mobile Malware

Tools for examining mobile malware
  • Mobile device emulators
  • Utilities to simulate network services for dynamic analysis
  • Decompilation and disassembly tools
  • Access to malware databases

Mobile Security

Assessment of mobile apps
  • Decompilation and disassembly tools
  • Scripts to detect common issues in mobile applications
  • Scripts to automate decrypting binaries, deploying apps, enumerating app details, and more

Development Tools:
  • Android SDK Manager
  • BlackBerry JDE
  • BlackBerry Tablet OS SDK
  • BlackBerry WebWorks
  • DroidBox
  • Eclipse IDE
  • Windows Phone SDK
  • Android 2.3.3, 3.2, and 4.0.3 Emulators
  • SecurityCompass Lab Server (HTTP and HTTPS)
  • BlackBerry Ripple
  • BlackBerry Simulators
Penetration Testing:
  • CeWL
  • DirBuster
  • Fierce
  • Nikto
  • nmap
  • Burp Suite
  • Mallory
  • w3af Console
  • w3af GUI
  • ZAP
  • BeEF
  • Ettercap
  • iSniff
  • Metasploit Console
  • Metasploit GUI
  • NetSed
  • SET
  • SQLMap
  • SSLStrip
Reverse Engineering:
  • APK Tool
  • Dex2Jar
  • Flawfinder
  • Java Decompiler
  • Strace
Wireless Analyzers:
  • Aircrack-ng
  • Kismet
  • Ubertooth Kismet
  • Ubertooth Spectrum Analyzer
  • Wireshark
Device Forensics:
  • AFLogical Open Source Edition
  • Android Encryption Brute Force
  • BitPim
  • BlackBerry Desktop Manager
  • Foremost
  • iPhone Backup Analyzer
  • MIAT
  • Paraben Device Seizure
  • SIFT Workstation
  • Sleuth Kit
  • SQLiteSpy
Mobile Infrastructure:
  • BES Express
  • Google Mobile Management
  • iPhone Configuration Tool

Saturday, August 25, 2012

From the Trenches – When to Break the 3-Minute Rule with Videos



I’ve been using videos for our dealership since 2009 when I saw a seminar by Jim Ziegler at NADA.  I was very impressed and went back to Chicago and bought a video camera.  Since then, we’ve put up over 2,700 videos on YouTube.

The best practice of 3 to 5 minutes for a video is well known and has a firm foundation.  But it’s not gospel.  Let me explain.

In my mind, there are basically four kinds of videos you can make for your dealership, each with its own acceptable length.

  • Branding videos – very short, maybe 1 minute max.
  • Conversion videos – 2-5 minutes
  • True Walkaround Videos – 3-8 minutes
  • Instructional Videos – whatever is necessary



Branding Videos are basically advertising for your store or your products and services.  Customers have a very low tolerance and acceptance for them because they are “push” marketing similar to TV commercials.  Need I say more?  These should be very polished and you probably want a professional involved.

Conversion videos are videos created and sent to customers by a salesperson on a specific vehicle because "a video is worth a thousand photos".  It allows the salesperson to introduce him or herself, plug the store, and ask for the appointment.  The customer wants to see that car and is willing to watch a little longer.

True Walkaround Videos are not directed at a particular customer.  They are intended to be useful to those researching a particular model.  Consumers who are researching want to see as much as possible about a particular vehicle so they can compare it to competing models.  They will watch the entire video if it provides what they are looking for.

Instructional Videos teach something and the complexity of doing so can cause varying lengths of time.  Consumers understand that.  Something simple can be taught very quickly but some things take longer just to get through the steps.  The consumer will follow along based on their interest.

There are exceptions to every rule.  The real key is whether you are providing the entertainment factor or are providing the information that is being sought by the consumer.  A customer that is looking for detail will not appreciate it if you skip over those details to keep your video short.

Best practices are there to be a guideline, not a rule.  Always practice being a consumer and that will tell you when to go beyond best practices.
Visit with me at AutoCon2012 in September!



Written by Tom Gorham
Editor, From The Trenches
Automotive Digital Marketing Professional Community

Emulating Microsoft’s Metro Design Language


Over the past few years, Microsoft has rolled out its new design language to a significant extent. Metro is the aesthetic basis of Windows 8, Microsoft’s next operating system, shipping this October. Let’s take a look at what Metro is, how we can emulate some of its desirable principles, and where it’s already being used.

What is Metro?

Metro is the name given to the design language used in Microsoft’s current and next-generation operating systems, including the upcoming Windows 8, the current Xbox 360 dashboard, and some of the company’s websites. Aspects of it were already evident in Microsoft’s earlier work, as far back as Windows XP and the Zune.
Microsoft’s design team have revealed that the language is partly influenced by public transport signage, and it places a significant emphasis on typography and a visual hierarchy consisting of text with varying properties. In software design, Microsoft described Metro as a “refresh” from the interfaces of Windows (pre-8), Android and iOS, which feature primarily icon-based interfaces.
Although the sharp transition seen in Windows 8 has been open to controversy, Metro’s reception has been chiefly positive, and it’s not difficult to see why.
To Microsoft’s credit, the aesthetic of [the Metro UI] is a bit more daring and informal than the tight, sterile icon grids and Rolodex menus of the iPhone and iPod Touch. – CNET
Microsoft highlight four principles as fundamental to the Metro UI: typography, motion, “content, not chrome” and honesty. Let’s take a look at how these translate to the web.

Typography

Typography is a key principle in Metro. As a user interface, Metro swaps out traditional icon-based design for navigation comprised solely of text. When that text needs to conform to a defined grid, it’s placed on quadrilaterals of solid color.
Metro is described as being modern and that is evident through the use of sans-serif type, specifically, in the case of Windows, the Segoe family of fonts. In Metro-influenced design, text is largely differentiated by nothing more than size, keeping typeface, weight and other properties the same or similar, differentiating in other properties only when acting as a link or on an alternative background color.
With the reduction of graphics in favour of text, Metro influences the use of text at all levels. A hierarchy is therefore born and on a minimalist site design, type size can act as the only distinction between sections (say, a blog post title and accompanying text). Marrying few disparities between text styles with a good, consistent use of whitespace is a core part of the typographical principles in Metro.
Microsoft themselves have an article on Metro’s typography guidelines. While it’s obvious you don’t have to comply with them (it’s mainly a guideline for native software developers) the guide does demonstrate one way to use Metro’s typographical ethos at various levels.

Content, Not Chrome

In conjunction with the typographic features of Metro, “content, not chrome” plays a huge part in what visually distinguishes Metro from other ways of designing.
The visible graphical interface features of an application are sometimes referred to as “chrome” – Wikipedia
Metro’s minimalist approach is ultimately met by avoiding interfaces with chrome. Instead, you’ll encounter content-centric designs which consist chiefly of text observing the aforementioned typographical qualities. By removing the chrome, designs push content as the main focus, particularly advantageous in blogs and other text-based sites.
By avoiding chrome in the design, a site can benefit from more seamless scaling and work effectively at smaller screen sizes, as part of a responsive design. You can observe an example of this by looking at Metro apps in Windows 8, juxtaposed with comparable ones on Windows Phone 7. The “content, not chrome” philosophy plays into a responsive design that’s obviously important when designing to be consumed at multiple sizes.
In Google Chrome, we strove to eliminate [Chrome] – not just because it leads to a simpler, cleaner design, but because we felt that your web applications should not appear to be constrained within the bulky cruft of a browser – they should feel like first-class applications on your desktop. – Google.
Microsoft’s main belief in “content, not chrome” is to delight with content, not decoration, achieved by reducing anything on the page that is not content. By doing so, you can achieve a pleasingly simple user experience, a philosophy that is already evident in many designs on the web, even if they don’t look quite like Microsoft’s Metro. Even browsers are implementing features to give users a viewing option that adheres to the principle.

Motion and Scrolling

The Motion aspect of Metro as a design language is focused primarily on applications, and its influence on your design depends on how much interaction a user has and the level of transition that interaction invokes.
Microsoft believes, through Metro, that the aesthetic of a design is matched in importance by the fluidity and performance of the design, and that the motion principles provide depth and responsiveness to a site.
Windows 8 and Windows Phone 7 also have a fondness for horizontal scrolling, with content laid out in a direction that’s virtually non-existent in a more traditional design paradigm. It hasn’t really been translated to the desktop web, even in Microsoft’s own sites, but is quite popular in mobile design. Therefore, responsive or touch-centric designs can utilize this pretty effectively. (Although, while this is used in Metro-based UIs, it’s not an official principle and is already evident across operating systems.)

Authentically Digital

Metro is “authentically digital”, a contrast to some of the more skeuomorphic design principles of companies like Apple. For example, Apple designed its Contacts, Calendar, Notes and Reminders apps to resemble their more physical counterparts, something which has proved pretty controversial.
Microsoft believes in something very different. The Metro language is designed so its principles don’t try to resemble something that it’s not, and instead embrace its digital nature through design. So, instead of making a list of contacts look like the address book you might buy in a store, it should look like a list of contacts on a phone or computer to comply with the principles in Metro.
The logo for Windows 8 was met with controversy of its own, but it plays into Microsoft’s principles of being authentically digital. Sure, it still looks like a window, but not like the window that’s part of a building – instead, simply the window in the name – so there’s no faux glass or wood in it. Even your design of a button should take no influence from the design of a button you encounter in real life.
Choosing not to emulate a traditional to-do list has allowed the designers [of Clear, for iPhone] to think differently and make full use of the technology available to them. The app has been built around multi-touch gestures and along with smooth animation, it makes for a very unique and elegant experience. – Shaun Cronin
This concept plays its part into the “content, not chrome” idea by removing superfluous detail that does nothing to support content.

A Philosophy, Not A Design

Importantly, Metro is not one design. While there are definitely sites out there built to directly resemble Windows 8 or Windows Phone 7, that’s just one interpretation of the language. Your design doesn’t need to look identical to a Metro Windows app to conform to the language.
Designs can take inspiration from Metro at varying levels, but obviously don’t necessarily have to look like they were manufactured by Microsoft. You can implement the typographical principles of Metro without needing to use Segoe as your typeface family of choice, much like you can implement “content, not chrome”.
Actually, with typography playing such an integral part of Metro, your design can end up looking nothing like Microsoft’s whilst still taking advantage of its practices.

Roundup: Inspired by Metro

Metro was engineered primarily for software design, but its features have begun to influence the web too, starting with Microsoft’s own properties.

Microsoft.com Preview

The preview of Microsoft.com is a very nice refresh of the company’s homepage to really align itself with Metro and give it a responsive update.

My Kind of Phone

My Kind of Phone is the UK blog of Windows Phone, so it’s pretty natural that it should follow the design language of the platform it covers. You can also see Metro’s “Motion” principles in play here, with some content transitioning in as you hover over items.

Zune

Microsoft’s Zune site is very heavily influenced by Metro, especially notable since the Zune was the first device to really embrace a full-blown Metro interface. You can see in the design a perfect embodiment of Metro on the web.

BBC

The BBC’s current homepage adheres heavily to Metro principles with a bold use of typography and grids.

Google / Gmail

Even Google’s websites have a spark of Metro, with the most recent redesign of the app suite featuring a generous use of whitespace and use of typography. Much like in Microsoft’s guidelines for typography, Google uses a single accent color in the midst of its simple typography.

Web Design Workshop #19: A.M Motors


Web Design Workshop is our regular community project where we ask readers to submit their work for your friendly, constructive criticism. It’s the perfect way to learn, offer opinions and have your own work critiqued! This week, something for the motorheads amongst you…


Rules of Engagement

Play nice! We deliberately select work which will benefit from advice and pointers. If you can’t be constructive in your comments, don’t. Other than that, offer any advice you can give. Feel free to link to examples and images which back up your points.

The Design

Web Design Workshop #19
We designed this to showcase all the cars manufactured by Maruti Suzuki in India. This is a dealer website. We are trying to make all the info available when a person needs to buy a car. – Jaijin Poulose

Looking for constructive criticism on your own work? Submit it for a workshop – most but not all submissions are published. Be patient though, there could be a queue…

Thursday, August 23, 2012

"Stand Your Ground" - A Case for GRC

If you've not had the opportunity to read the recent Dan Geer / Jerry Archer IEEE S&P Cleartext column titled "Stand Your Ground," then please go read it now. It's only a single page, two-column article and it won't take you long. It is, hands-down, one of the best summaries of contemporary, leading-edge thinking on the state of infosec that I've seen.

Finished? Cool... let's continue...
Allow me to take an outlandish step and try to summarize this excellent, concise work in three bullets. I'll then try to elaborate a bit on each of these points to put things into a practicable use case.

In short, the article points out the key business imperative: survival. In order for a business to succeed and have value, it has to survive over time, enduring ups and downs, and especially being able to handle various IT-related incidents. In order to achieve this objective, Geer and Archer roughly cover three things:


1. Radically Reduce Attack Surface

2. Design For Resilience and Palatable Failure Modes

3. Automate For Manageability


Allow me to elaborate a bit on each of these points.

Radically Reduce Attack Surface

At the ISSA International Conference 2011 in Baltimore, then-FBI executive assistant director Shawn Henry not only advocated companies reducing their online presence, but also that companies should consider whether or not certain types of records should even be electronic rather than remaining in a more traditional physical form. I scoffed at the notion back then, but am now starting to come around to the idea, if ever so slightly.

Part of the reason I've started to come around is because of things like this story in Business Week that talks about malware targeting the private networks of airlines, potentially exfiltrating data or even manipulating flight and reservation information. Of course, we've heard similar horror stories about SCADA and ICS systems in the energy sector, where 10+ years ago these companies would turn a blind eye to infosec advice because these systems weren't internet-connected, but today almost brag about how their operators can access and manage these systems from anywhere in the world. Kind of missing the point, eh?

In the article, the authors comment that "expansion of the enterprise's catalog of essential technologies creates unfunded liabilities," which exactly translates to an increased attack surface. Put another way, the more we rely on technology, the more likely we are to experience a very bad incident when one of those technologies becomes compromised or unavailable.

Now, some of you may be cringing at the use of the phrase "attack surface," and understandably so. Really, when talking about "attack surface," I'm implying a few things. First, it means reducing the threat and vulnerability profiles of online systems. This objective can be achieved in part through traditional patching and hardening practices, as well as by working to not draw attention to assets (e.g., don't volunteer yourself as a ready target in the Washington Post or New York Times). Second, it means restricting what systems and data are actually online and accessible. Third - and this is quickly becoming most difficult - is aggressively working with employees to help reduce what they leak in and out of the organization. BYOD policies inevitably mean allowing in hundreds of relatively insecure, unmonitored devices that can expose the enterprise to a wide array of badness, not to mention that you're also then letting people walk out with devices that have more processing power and storage than an entire office of PCs did 20 years ago.

If an asset is of high value to the organization, then it's time to evaluate what level of exposure is acceptable - and, by extension, what level of liability and loss is acceptable - and tune policies and practices accordingly. Oh, and make sure these decisions are not made in a vacuum. These are business decisions that must be made by business leaders with input from IT, security, and risk personnel.

Design For Resilience and Palatable Failure Modes

So... you've taken the first step and removed what you could from harm's way... sadly, it's probably not nearly as much as you'd like (as a security or risk management person), but at least it's a start... now what do you do?

As part of a survivability strategy, your organization must conduct risk management planning from the perspective of assuming incidents will occur. As the NSA has done, it's time to also assume that the line between "inside" and "outside" is at best thin, and at most likely merely representative and not truly restrictive. Also, you must assume that your organization will be caught-up as part of the collateral damage (or perhaps directly targeted, depending on what your business does) in any number of online "exchanges" between government forces, spies, protestors, anarchists, etc. (pick a name/label, it'll generally fit). The point here is this: you'll never stop all potential incidents, and thus must plan for managing these incidents effectively and efficiently.

There are a few considerations around this topic... first, assuming failures will occur, you must have contingency plans (e.g., incident response and business continuity plans) in place, with people adequately trained, and procedures adequately tested and drilled. Second, when planning, designing, and strategizing, it's imperative to consider infrastructure and data resilience. That is, how well can your business continue to function despite degraded conditions? Then, lastly, it's best to attempt to build "acceptable" failure modes into your operations. Specifically, when designing and architecting solutions, evaluate the ways in which the solution can fail, and plan for those failure modes as well as possible. Not only will this approach help you identify worthwhile defensive trade-offs (risk decisions), but it will also show you how your environment can break and, as a result, how you can build safety nets around those failure points to help dampen the net-negative impact. Also, don't forget to loop back through the first point on reducing attack surface as part of these design and architecture discussions, since the easiest way to avoid a data breach is to never expose the data in the first place.
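The "acceptable failure modes" idea can be sketched in code. The following is a minimal, hypothetical Python sketch (the class and function names are my own, not from the article) of a service that plans its own degradation: when a dependency fails, it serves stale cached data with an explicit label rather than crashing.

```python
import time


class DegradableService:
    """Illustrative sketch of a planned, 'palatable' failure mode:
    serve live data when possible, fall back to clearly-labeled
    stale data when the dependency fails, and only then report
    an explicit 'unavailable' state - never an unplanned crash."""

    def __init__(self, fetch_live, cache_ttl=300):
        self.fetch_live = fetch_live   # callable that may raise
        self.cache_ttl = cache_ttl
        self._cache = None
        self._cached_at = 0.0

    def get(self):
        try:
            value = self.fetch_live()
            self._cache, self._cached_at = value, time.time()
            return value, "live"
        except Exception:
            # Planned failure mode #1: stale-but-usable data.
            if self._cache is not None:
                return self._cache, "stale"
            # Planned failure mode #2: explicit degraded response.
            return None, "unavailable"
```

The point of the sketch is the design exercise, not the code itself: each `except` branch corresponds to a failure mode that was evaluated and chosen in advance, which is exactly the trade-off analysis the article advocates.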

Automate For Manageability

Last, but not least, is the closing paragraph of the article, which advocates automating as much as possible to account for ever-changing compliance requirements, to reduce first-level support overhead, and to improve the overall efficiency of other efforts. In a nutshell, I view this paragraph - and much of the article - as a strong use case for GRC solutions (caveat: I'm a wee bit biased).

As I see it, there are a few ways where GRC solutions help automate various duties in the enterprise, and as a result reduce some overhead burdens:

* Managing Compliance Requirements (w/ UCF): A good GRC solution will provide a means to effectively map external compliance objectives and internally-designated requirements (such as audit/certification readiness) to a standard controls framework (like the Unified Compliance Framework), and further through to the policy framework. Organizations should be managing their own custom controls framework, and not simply acting ad hoc in response to each individual set of requirements derived from any number of standards and regulations.

* Providing a Self-Service Policy/Requirements Interface: In addition to mapping requirements to controls to policy documents, it is also then vital that this information be made available to users, and that users be given nominal training on how to access and make use of this information. Building concise, clear policies that articulate requirements and link them to their source authorities will help users understand the "why" of a given policy, which will in turn increase the likelihood that they will want to conform, and will in fact adhere to policies.

* Automating Various Processes: Many processes can be automated using tools like GRC solutions. For example, policy authoring, review, and approval processes, or even routine periodic reviews and revisions. Automated processes can also be used to track and follow-up on exception requests, various risk factors within a risk register, remediation activities around vulnerability scan or pentest findings, or exercise of business continuity and disaster recovery plans.

* Centralizing and Coordinating Technical Security Data (addressing "big data"): Much of the "big data" problem revolves not around performing analysis in a given silo, but in being able to coordinate analyses across silos. GRC solutions provide a convenient means to bridge silos to allow management, executives, and risk managers to gain a broader, more complete view of operations and the business, and to thus make better-informed decisions.

* Managing Various Requests and Reports: How many organizations have multiple methods for capturing requests or types of reports? GRC solutions today provide a central point through which these requests and reports can be gathered, processed, and tied to other related data (e.g., incident and post-mortem reports tied to vuln scan and pentest data, as well as configuration requirements that may not have been followed/met). Again, the underlying principle here is automating collection, tracking, and management of this data, as well as encouraging better self-service for users, which in turn improves participation and conformance to desired behavioral norms.
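The requirements-to-controls mapping in the first bullet is, at its core, a many-to-many lookup. A minimal Python sketch follows; the control IDs, regulation references, and policy paths are all hypothetical placeholders, not data from any real framework:

```python
# Hypothetical data: several external requirements map onto ONE
# internal control, so a change is managed once, not once per
# regulation. Real GRC tools hold this in a database, not a dict.
CONTROL_MAP = {
    "AC-01 Password policy": {
        "requirements": ["PCI DSS 8.2", "ISO 27001 A.9.4", "Internal audit P-12"],
        "policy_doc": "policies/authentication.md",
    },
    "LG-03 Log retention": {
        "requirements": ["PCI DSS 10.7", "SOX ITGC-4"],
        "policy_doc": "policies/logging.md",
    },
}


def requirements_for(control_id):
    """All external obligations satisfied by one internal control."""
    return CONTROL_MAP[control_id]["requirements"]


def controls_for(requirement):
    """Reverse lookup: which internal controls cover a given mandate,
    and therefore which policy documents answer an auditor's question."""
    return [cid for cid, ctrl in CONTROL_MAP.items()
            if requirement in ctrl["requirements"]]
```

The reverse lookup is what gives users the "provenance" discussed later in the post: from any requirement you can walk back to the controls and policy documents that satisfy it.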

There's much more that GRC solutions can do to help, too, but this hopefully gives you an idea of the legitimate role that GRC can play for businesses. Now, obviously, GRC solutions are no panacea, and still require having smart people in your organization to help set things up, manage them on an ongoing basis, and to provide that sentient human interrupt that automated processes occasionally need. Overall, though, GRC can help reduce some of the ongoing overhead costs typically associated with topics like policy frameworks, control frameworks, managing compliance objectives, and preparing for routine or major audits (e.g., SSAE-16 readiness, ISO 27001 readiness, PCI compliance). And, perhaps more importantly, these solutions provide users with a self-service interface where they can quickly answer policy and requirement questions, while also identifying the provenance of requirements, such as by seeing how they tie back to audit or compliance objectives.

In Closing...

At the end of the day, all three of these points should represent a critical focus for a security or GRC team. In particular, when worked together, the net effect is to reduce the overall risk profile facing the organization, which can have tremendous positive benefits. At the same time, it also helps the business better articulate survival objectives and gain a better understanding of just what assets (i.e., people/process/technology) are truly critical to ongoing operations.

Why XSS is serious business (and why Tesco needs to pay attention)

It was three weeks ago now that I wrote about Lessons in website security anti-patterns by Tesco where I pointed out a whole raft of basic, flawed practices which jeopardised the security and privacy of shoppers. These practices in and of themselves were (are) bad, but what really seemed to fire up a lot of people was Tesco’s response when I first flagged it with them:
Twitter: @troyhunt Passwords are stored in a secure way. They’re only copied into plain text when pasted automatically into a password reminder mail.
1,883 retweets, numerous media articles, and a chorus of software and security professionals decrying Tesco’s approach to security (and customer service, for that matter) later, including one of the industry’s most preeminent security brains referring to their password security as “lousy”, nothing has changed. One of the things that hasn’t changed is their continued assertion that there’s nothing to see here: no security problems, move along now.
Continued assertion that Tesco security is "robust"

This was just last week and well after the “robust” theory had been well and truly rejected by a great number of people who actually know what robust security looks like (or at least know what it doesn't look like). Of course this same canned response was used many, many times to people raising legitimate concerns so I’m sure it’s nothing against Craig in particular.
But Tesco have security problems that go beyond what I originally wrote about, far beyond. At the time, I speculated that this was reminiscent of the Billabong situation where there were numerous bad security practices easily observable in addition to the ones which allowed an attacker to steal 21,000 account details. So it was no surprise when people started commenting about additional vulnerabilities such as SQL injection (unverified) or personally emailing me details of risks such as cross site scripting (verified). It was also no surprise to hear that many people had raised security concerns with Tesco over many years but to no avail. After all, they’d “never been hacked” (their quote) so things must be robust, right?
Let me get to the point of this post: I had someone from Tesco connect with me privately, after which I asked for some technical contacts I could pass information on to. I’m not going to divulge any names or positions publicly, but what I can say is that I was given the details of multiple people in senior technology roles, to whom I then passed on (anonymised) details of the cross site scripting (XSS) risk that was sent to me. That was two and a half weeks ago; after nearly a week with no response, I followed up on the original message. Nothing. Nada. Zip. And the vulnerability is still there.

What you need to know about XSS

At this point I want to stop talking about Tesco specifically and turn this around to a more constructive post; things you need to know about XSS. I’m going to approach this a little differently and provide a video as it requires a bit of social engineering which is best illustrated in real time. A number of the risks you see in this video are present on Tesco’s website but equally, some of them are not. Clearly I’m not going to disclose any detail which will allow someone to go off and get involved in malicious activity, but what I am going to do is show how XSS can be used to cause serious trouble and that’s what I hope Tesco (and others) can begin to appreciate.
Just on browser compatibility for that XSS: IE9 and IE10 are actually pretty good and will warn you about it without executing it. All other browsers tested – Chrome, Firefox and Safari (desktop and iOS) – will happily parse it and allow the exploit to occur.

Summary

As I said earlier, not all the risks I’ve highlighted in the video may be present on Tesco’s website, and I’ve intentionally avoided reproducing the specific circumstances where they are vulnerable. Some would argue that sufficient time has passed since responsible disclosure to publish details, but I don’t think that is necessary, regardless of how frivolous Tesco believes the risk to be. Certainly, though, there are enough of these risks to be very concerned, and concerned enough not to leave them unfixed for days, let alone weeks.
Interestingly, it seems that Tesco’s rather unique approach to security is now coming under scrutiny from the Information Commissioner’s Office in the UK. Whilst a statement such as "We are aware of this issue and will be making inquiries" is far from a damning indictment, it will be interesting to see how this unfolds and whether the company may actually be called on those “lousy” practices.
To wrap up on the general XSS discussion, there are really two key things which absolutely, positively have to happen but often don’t:
  1. All input must be validated against a whitelist of acceptable value ranges. In fact, this is so important that it appears in big, bold letters in my post about OWASP Top 10 for .NET developers part 2: Cross-Site Scripting (XSS)
  2. All output must be encoded for the context in which it is emitted: HTML, HTML attributes, CSS, JavaScript, XML, XML attributes, etc. Getting the context wrong or, even worse, not encoding at all can be disastrous.
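A minimal sketch of those two rules (in Python for illustration, since the original post targets .NET; the field names are hypothetical):

```python
import html
import re

def validate_quantity(value):
    # Rule 1 - whitelist validation: accept only a 1-3 digit integer.
    if not re.fullmatch(r"\d{1,3}", value):
        raise ValueError("quantity must be a 1-3 digit number")
    return int(value)

def render_greeting(name):
    # Rule 2 - output encoding for an HTML body context; attributes,
    # JavaScript, CSS and XML each need their own encoder.
    return "<p>Hello, {}!</p>".format(html.escape(name, quote=True))

# An injected script tag is rendered inert by the encoding:
print(render_greeting("<script>alert('XSS')</script>"))
```

In .NET, the rough equivalents are regular expression validation plus encoders such as HttpUtility.HtmlEncode or the AntiXSS library, as the post series linked above describes.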
Other lax security practices can then be combined with XSS to make the site more readily exploitable: cookies which aren’t flagged as HttpOnly, for example. Plus, of course, there’s good old social engineering to help exploit reflected XSS.
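For instance, flagging a session cookie as HttpOnly (and Secure) keeps it out of reach of document.cookie, which is exactly what a cookie-harvesting XSS payload relies on. A minimal sketch with Python's standard http.cookies module (cookie name and value are purely illustrative):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "abc123"          # illustrative value
cookie["session"]["httponly"] = True  # not readable from document.cookie
cookie["session"]["secure"] = True    # only sent over HTTPS
header = cookie["session"].OutputString()
print("Set-Cookie:", header)
```

The same flags exist in every mainstream web stack; in ASP.NET they are the HttpOnly and Secure properties on HttpCookie.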
Then there’s the fact that in terms of web app vulnerabilities, XSS gets very, very messy. How? Well, to begin with, different browsers implement different levels of defence. For example, IE10 protects against some of those earlier examples where latest-gen Chrome and Firefox don’t. Then there’s the fact that XSS is only loosely related to semantically correct HTML; browsers frequently parse markup which shouldn’t be parsed – for example, IE6 will happily parse a tag like <IMG SRC='vbscript:msgbox("XSS")'>, which contains VBScript. The XSS Cheat Sheet will give you a sense of just how obscure things can get.
As always, we’re back to security in layers. One particular flaw may not appear too serious, but combined with other risks, things can get very nasty very quickly. Mitigating XSS on the vast majority of websites is simple – I mean really simple, at least for things like whitelisting and output encoding. The website used in the video above was easily made insecure just as it is easily made secure, and you can download the whole thing from GitHub (including comments on how to secure it) from the resources below.

Resources

  1. The codebase for the “robust” site used in the video is publicly accessible over on GitHub under ShopWithRobustSecurity
  2. You can still see it in action at ShopWithRobustSecurity.apphb.com
  3. The site which was then used to harvest cookies and credentials is also on GitHub under EvilShopliftingSite
  4. WhiteHat Security’s report referring to XSS being the most prevalent risk is here
  5. Details on Samy’s MySpace XSS worm are on the SecurityFocus website

Wednesday, August 22, 2012

Using FTP and FileZilla


Quick Guide

This guide gives you a short overview of how to use the FileZilla client. By default, FileZilla needs no configuration, so you can start working with the program right away.


Download FileZilla here: http://filezilla-project.org/download.php

Connecting to an FTP server

Using the Quick Connect bar

To connect to an FTP server, enter the address of the server into the host field of the Quickconnect bar (e.g. example.com - see image below). If it is a special server type like an SFTP server, add the protocol in front of the address. In case of an SFTP server, start the address with 'sftp://' (e.g. sftp://example.com). Enter the port of the server into the port field if it is not the default port (21 for FTP, 22 for SFTP). If a username / password is required, enter it in the corresponding fields, otherwise the default anonymous logon will be used. Click on Quickconnect or press Enter to connect to the server.
(screenshot: entering connection info in the Quickconnect bar)
Please note that Quick Connect is for... quick connections - so there is no way to edit the quick connections list, which stores the last 10 entries. To store FTP server names, you should use the Site Manager instead.
Quick Connect is good for testing the login info before making a site manager entry. Once you connect, you can choose File -> "Copy current connection to Site Manager..." to make a permanent entry. It is usually best to check your login info with Quick Connect before making a permanent entry.
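As a rough illustration of the Quickconnect rules described above (optional protocol prefix, plain FTP assumed when no prefix is given, default ports 21 and 22), here is a small hypothetical parser; this is not FileZilla code, just a sketch of the same logic:

```python
from urllib.parse import urlsplit

DEFAULT_PORTS = {"ftp": 21, "sftp": 22}

def parse_quickconnect(address):
    """Split a Quickconnect-style address into (protocol, host, port)."""
    if "://" not in address:
        address = "ftp://" + address  # no prefix means plain FTP
    parts = urlsplit(address)
    port = parts.port or DEFAULT_PORTS.get(parts.scheme, 21)
    return parts.scheme, parts.hostname, port

print(parse_quickconnect("sftp://example.com"))   # ('sftp', 'example.com', 22)
print(parse_quickconnect("example.com:2121"))     # ('ftp', 'example.com', 2121)
```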

Using Site Manager

You can use the FileZilla Site Manager to specify site-specific parameters and connect to the targeted FTP site. The Site Manager lets you store entries and configure more parameters than Quick Connect allows.

Special case: Servers in LAN

If the server you are connecting to is in your home LAN, it may not have a domain name like servers on the Internet do. In a LAN, simply use the internal IP address of the server PC. In most cases the network name (computer name) will work, too. If the server is on the same PC as FileZilla, you can even use localhost or 127.0.0.1 as the hostname.
NOTE: If you connect from outside your LAN, this does not apply; you have to use the external (WAN) IP address instead.

Navigating on the server

After a successful connection attempt, a list of files and directories appears on the right side of the main window. The name of the current remote directory is listed in the edit field at the top. Below that, you will see the remote directory tree, and under the tree, a list of the contents of the current remote directory.
To change the current remote directory:
  • Type a directory name into the edit field and press enter, or
  • Click a directory in the directory tree, or
  • Double-click a directory in the list of the current directory contents
You will notice a directory called ".." listed in virtually all directories. Selecting this directory allows you to go up to the parent directory of the current directory.
Question marks ("?") appear on directories you haven't accessed yet, indicating that the FileZilla Client can't tell if there are subdirectories within those directories. If you access the directory the question mark will vanish.
(screenshot: navigating the remote directory tree)

Navigating on your machine

Navigating on your machine works almost like navigating on the server. The current local directory and the local directory tree are displayed on the left side of the main window by default.
(screenshot: navigating the local directory tree)

Synchronized Browsing

If you have an identical directory structure on the local machine and the server, you can enable synchronized browsing. This means that any directory navigation on one machine is duplicated on the other.
To enable synchronized browsing, create an entry in the Site Manager, and on the Advanced tab, ensure that the Default local directory and the Default remote directory have the same structure. Then check "use synchronized browsing," save your settings, and connect.

Directory Comparison

To quickly see differences between files on the local machine and the server, choose View > Directory Comparison, and choose either "compare file size" or "compare modification time." (You can also hide identical files by checking that option.) Then choose "Enable."
You will now see color-coded differences between copies of the same file on the different machines. See their meanings here.

Transferring files

You can upload or download a file by double-clicking on it. It will be added to the transfer queue and the transfer starts automatically. To transfer directories and/or multiple files, select them and right-click the selection. Then you can click on Upload/Download in the popup menu.
(screenshot: the Upload/Download context menu)

You can also drag files from one side and drop them on the other side. To add files to the queue so that they will be transferred later, select them and click Add to Queue from the popup menu. You may also drag the files directly into the queue. Click the process-queue button on the toolbar to start the transfer.
Or, you can click on a file, then drag the file (a box is added to the arrow cursor) to the directory where you want to move it. The directory will be highlighted when you are over it. Let go of the mouse button and the file will be moved to the directory.
(screenshot: dragging files to transfer them)
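If you ever need to automate such transfers outside the GUI, the same upload and download actions can be sketched with Python's standard ftplib module (the server, credentials, and file names below are placeholders):

```python
from ftplib import FTP

def upload_file(ftp, local_path, remote_name):
    """Roughly the equivalent of right-click > Upload in FileZilla."""
    with open(local_path, "rb") as f:
        ftp.storbinary("STOR " + remote_name, f)

def download_file(ftp, remote_name, local_path):
    """Roughly the equivalent of right-click > Download."""
    with open(local_path, "wb") as f:
        ftp.retrbinary("RETR " + remote_name, f.write)

# Usage (placeholder host and credentials):
# ftp = FTP("example.com")
# ftp.login("username", "password")
# upload_file(ftp, "report.pdf", "report.pdf")
# ftp.quit()
```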


Configuring URL Parameters in Webmaster Tools

Configuring URL Parameters in Webmaster Tools: Webmaster Level: Intermediate to Advanced




We recently filmed a video (with slides available) to provide more information about the URL Parameters feature in Webmaster Tools. The URL Parameters feature is designed for webmasters who want to help Google crawl their site more efficiently, and who manage a site with -- you guessed it -- URL parameters! To be eligible for this feature, the URL parameters must be configured in key/value pairs like item=swedish-fish or category=gummy-candy in the URL http://www.example.com/product.php?item=swedish-fish&category=gummy-candy.
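To see what "configured in key/value pairs" means in practice, that example URL decomposes cleanly into its parameters; a quick, purely illustrative check with Python's standard library:

```python
from urllib.parse import urlsplit, parse_qs

url = "http://www.example.com/product.php?item=swedish-fish&category=gummy-candy"
# Each parameter is a key/value pair, which is what makes the URL
# eligible for the URL Parameters feature.
params = parse_qs(urlsplit(url).query)
print(params)  # {'item': ['swedish-fish'], 'category': ['gummy-candy']}
```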





Guidance for common cases when configuring URL Parameters. Music in the background masks the ongoing pounding of my neighbor’s construction!


URL Parameter settings are powerful. By telling us how your parameters behave and the recommended action for Googlebot, you can improve your site’s crawl efficiency. On the other hand, if configured incorrectly, you may accidentally recommend that Google ignore important pages, resulting in those pages no longer being available in search results. (There's an example provided in our improved Help Center article.) So please take care when adjusting URL Parameters settings, and be sure that the actions you recommend for Googlebot make sense across your entire site.




Written by Maile Ohye, Developer Programs Tech Lead

W3C Hammers Out the Details of CSS Variables

W3C Hammers Out the Details of CSS Variables:



The mythical Jackalope variable surrounded by CSS bunnies. Image: Wikimedia

The W3C’s CSS Working Group, the standards body that oversees the CSS specification, is getting closer to defining one of CSS’s most requested features — CSS Variables. However, if you’ve been dreaming of SASS or LESS style power without the preprocessor, the new CSS Variables draft might leave you scratching your head.

Variables used to be one of the most requested features for CSS, particularly from programmers accustomed to languages with variables. But, between then and now, CSS preprocessors like SASS and LESS have largely filled the role by offering variables (and more). Still, SASS and LESS are not CSS.
By the same token, what’s being proposed under the name CSS Variables is not what most developers would think of as a variable. Daniel Glazman, co-chair of the W3C CSS Working Group, calls the new variables “Inherited User-Defined Properties.”

In fact, what is being proposed are custom properties which use a function to access the value of those properties later: more of a getter/setter pair than a directly accessible variable.
When variables were first proposed many assumed the syntax would look something like SASS or LESS, roughly like this:
$foo = myvalue;

/* and then */
.selector {
    color: $foo;
}
We showcased the actual syntax back when WebKit first landed preliminary support for variables, but here’s a quick refresher:
:root {
    var-header-color: #06c;
}
h1 { background-color: var(header-color); }
The first rule uses the new variable syntax and defines a property named “var-header-color” on the root element. You can then access that value throughout your stylesheets with the var(header-color) syntax.
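One consequence of treating these as properties is that custom values cascade and inherit like any other property. A hypothetical sketch in the 2012 draft syntax discussed here (note this syntax was still subject to change at the time):

```css
:root    { var-accent: #06c; }   /* default accent for the whole document */
.sidebar { var-accent: #c06; }   /* overrides the value for sidebar descendants */

h2 { color: var(accent); }       /* resolves per element: #06c normally,
                                    #c06 for an h2 inside .sidebar */
```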
Why not use the more familiar PHP-like “$var” syntax? For one thing, this proposal makes it easier to understand the cascade. See Glazman’s blog for more details on how variables will inherit. Tab Atkins Jr., a Google representative at the CSS Working Group, explains another reason for the new syntax: “If we use $foo for variables, we’ll be unable to use it for future ‘variable-like’ things.”
So what are “variable-like” things? Atkins continues:
For example, if we do define an alternate form that are more SASS-like (can be used anywhere, but are global; more “macros” than “variables”) we’d have to use some other glyph for them. That’s suboptimal. More specifically, if we ever do some sort of “variables” in selectors, we must use a compact form like $foo or something. Anything else is unusable, I believe.
So we have our variables in CSS, but with a syntax that’s not quite the way many developers wanted it. As Glazman writes, “Before shouting, don’t misunderstand us: we clearly see the simplicity, readability and maintainability of the $foo syntax… we do understand why Web Authors prefer a compact and simple notation like $foo but we have decided it comes at a too expensive cost right now.”

If you’re really put off by the new syntax, take comfort in the fact that the variables draft remains just that — a draft spec. Glazman and Atkins both leave the door open to adding $foo style variables back to the spec. As Atkins writes, “If we don’t find anything that needs this kind of syntax, then we can go back to letting Variables consume this kind of syntax for its own use.”

Sunday, August 19, 2012

Link to Look Before You Lock


FOR IMMEDIATE RELEASE:
VL Digital Marketing | VL Automotive Marketing
info@vlautomotivemarketing.com
http://vlautomarketing.com
Look Before You Lock
Raleigh, NC Aug 20, 2012
VL Automotive Marketing will be including links and images for the national “Look Before You Lock” campaign in its national automotive blogging network, which provides research information for new and used car buyers.
The Department of Transportation and the Department of Health and Human Services are teaming up to educate people about the danger of children being left inside hot cars. They're joining a campaign called "Where's Baby: Look Before You Lock."
The 'Look Before You Lock' campaign reminds parents to double-check their vehicles before they walk away. The government launched a crackdown Friday morning, following the heat-related deaths of at least eight children nationwide in just the first week of August.
“We will be providing an image and link to the Look Before You Lock web site at http://vltiny.com?p/19jp275 on every blog that is posted this week, and our CEO Manny Luna has been very passionate about getting the word out. We encourage all of our automotive dealers and staff to go to the national site, download posters, stickers, and other material, and give them out to help keep parents aware,” says VL Digital Marketing's Chief Technology Officer, Morgan Todd.


VL Digital Marketing | VL Automotive Marketing
info@vlautomotivemarketing.com
http://vlautomarketing.com
For additional information or a sample copy, Contact: info@vlautomotivemarketing.com
Manuel “Manny” Luna, CEO and Founder of VL Automotive Marketing, has been involved in the automotive industry since 1988. His career has spanned automotive sales, management, finance, and Internet marketing management. Manny is known for his energy, tireless work ethic, his passion for the automotive business, and his ability to see beyond the box, all of which have led him to build VL Automotive Marketing from the ground up.
# # #


Google Finally Rolls Out Vanity URLs for Google+

Google Finally Rolls Out Vanity URLs for Google+:
Recently Google started offering vanity URLs for Google Plus. The announcement stated that vanity URLs will roll out first to a few verified users but will soon become available to all users. Finally, this is what many of us have been waiting for with Google+: no more massive strings of numbers in your URL. Now you can have a simple short URL like plus.google.com/yourname, which will make Google Plus marketing much more convenient.
Besides letting you get rid of the number string and add your personal username, Google is also dropping the ‘plus’ from the URL to make it even shorter. For example, we would be able to change our Google+ URL to google.com/+wikimotive, or Wired could use google.com/+wired. This should make it much easier for brands to optimize their Google+ pages and do better branding; not to mention you can actually write out your profile URL now, and it will be recognizable and easy to remember.
Some brands, as well as a few famous people, have already taken advantage of these new URLs. Among them are Hugo, Ubisoft, David Beckham, and Hugh Jackman. Not to worry, though: your old link to Hugh Jackman’s profile will still work as of this writing. Hopefully Google will continue to forward the old URL scheme to the new one once you set it up, so that you don’t have to go scouring the internet for any Google+ link you may have buried somewhere and change it.
Original Post about vanity URLs for Google Plus posted on Wikimotive's blog under the title Google Plus Introduces Vanity URLs

Tips to Optimize Your Dealership’s Google+ Page


Tips to Optimize Your Dealership’s Google+ Page:

So you've created your dealership's Google+ Brand Page in preparation for the move from Google Places to Google Plus Local...but now what?  Here are some tips for optimizing your dealership's Google+ page:


Keep SEO in Mind: When putting information into the "About Us" tab, use relevant keywords. Also use these terms when adding content to your page to help search engines (especially Google) understand what your page is about.  Use "Recommended Links" to send traffic to your dealership's blog and other social media profiles.

Have Content Ready: Google recommends having 10-20 posts on your Google+ business page before you really start promoting it.  Recent posts are one of the things used to judge the quality of your page, so having it full of content once people start visiting will help.

Promote Your Page: Add a Google+ button everywhere you promote your other social media accounts. Allow your website visitors to +1 your inventory and dealership from your site.

Incorporate your +1's into AdWords: Google will pull your +1 count into your AdWords campaigns if you link them.  Within your AdWords account, click the Ad Extensions tab. Select "Social Extensions" from the "View" menu, and enter the URL to your Google+ page.  Talk to your dealer website provider or PPC vendor to get more information.

Make sure your dealership's Google+ page is optimized and ready for the combination of Google Places into Google Plus Local.

Wednesday, August 15, 2012

OWASP Xelenium: Security Unit Tests

OWASP Xelenium: Security Unit Tests:

(from V.Vasanth)

Hello OWASP Friends,

Warm Greetings!!

Today, I would like to introduce you all to my humble effort called ‘OWASP Xelenium’, which helps the user identify security threats present in web applications.

Xelenium is an automated security testing tool that uses Selenium, a leading open-source test automation tool, as its engine. Xelenium accepts very limited input from the user and tests the application using a predefined automation procedure.

The current version of Xelenium identifies cross-site scripting threats present in a web application. In subsequent versions, Xelenium will be enhanced to identify other leading security threats.

The first version of Xelenium was published on June 22nd, 2012, and the second version on August 6th, 2012. So far, there have been around 4,000 downloads.

You can find more info here:

In the next version, I am planning to move the UI of Xelenium from Java Swing to JavaFX. I am also looking at the possibility of adding enhancements to handle DOM-based XSS.

I would encourage you to use this solution and pass on your comments about it.
Hope this solution helps you in some way. Looking forward to your comments!!!

Thank you!!
V.Vasanth

Tuesday, August 14, 2012

Effective SEO tactics




Through effective SEO tactics, you can improve your search engine rankings for important terms, gain more traffic and do more business.

Search engine optimization techniques focus on increasing the organic, or natural, traffic that you receive based on your ranking within the search engines.

While each search engine uses its own algorithm for determining the ranking of every page that is indexed, it is possible to increase your rankings by making your site informative and visible via both on-page and off-page techniques. Sites that are designed with ease-of-use and quality information in mind tend to do better than those built sloppily and without a solid plan.

It really doesn't matter what type of website you have, whether it's personal or geared towards your business. Incorporating search engine optimization techniques into the creation and ongoing upkeep of your site will ensure you receive higher levels of traffic and, ultimately, greater success.

Kill the Myth:
The world of web design is complex and fraught with misinformation. Chief among the misconceptions is the belief that most web designers have a solid understanding of Search Engine Optimization (aka SEO). In truth, the opposite is usually the case. Most web designers know very little about SEO, and this includes those with college degrees in website design.

Normally, they weren't actually trained in the art and science we call SEO. They were taught that such work should be performed by an individual or company that specializes in Search Engine Optimization.

Web designers tend to be "artsy" by nature and not best suited to highly technical work. That's not to say that there aren't some who are good at both, just that it's probably the exception rather than the rule.

The real problem is that most clients know little about the web design process and mistakenly believe that their web designer has optimized their site for the search engines. As many web designers forget to discuss the topic with their clients, the site is put live without being optimized.

To improve your chances of getting a beautiful website that is well-optimized for the search engines, it's important you understand these three factors:

Web Design

In this phase, your web designer will work with you to create a nice looking website according to your project specifications. This part is primarily graphic in nature but it does form the foundation of the entire project.

Website Coding

After the website graphics are finished, the design will be coded such that it can be understood and displayed by browsers like Firefox, Internet Explorer, etc. Some web designers will use a program like Adobe's Dreamweaver to code the design while others will hire a web coder to do the process manually.

As a general rule, designs that are "hand coded" by a code specialist are more likely to follow the standards set by the World Wide Web Consortium (aka W3C). They are also more likely to be "lighter" in their design, which is beneficial to SEO.

Search Engine Optimization

After the site has been designed and coded, the work can begin to optimize the website for the search engines. This is best performed by an SEO specialist.

The SEO specialist's job is to work closely with the client in an effort to understand their business and what outcome(s) they are looking to produce. With this information, they will begin the process of keyword selection and set about optimizing each page of the site.

In addition to the work performed on the site itself, the SEO specialist will perform a variety of "off page" tasks. These tasks may include link building, social bookmarking, and the creation of citations, among many others.

As you can see, successful web design is a rather complex topic and frequently requires the work of several specialists. Armed with this information, you'll know what questions to ask to ensure your project is completed to your satisfaction.

After all, a beautiful website that receives few visitors won't help you achieve much in the way of new business.

Monday, August 13, 2012

Winning the Video Thumbnail in Google Universal Search

Winning the Video Thumbnail in Google Universal Search:

Posted by mybinding1
Have you noticed that more and more video results are showing up in Google search results? Everywhere I turn, it seems that Google is providing me with options of videos to watch on the first page of their search results. As a user, I appreciate the video content and will often click on the video results. As a marketer, I am incredibly jealous of those placements and am constantly searching for ways to capture that traffic for my site. This post highlights the five most important factors I've found that play the largest role in when and where a thumbnail is awarded.

1. Index Status

This may seem like a no-brainer, but if your videos are not included in the video index, then you will not be eligible for the video thumbnail. That makes getting the video content on your site indexed your first priority.
If you want to check whether your videos are included in the index, simply do a site search for your domain and click the Videos tab in Google. Below is an example of the list of videos included in the index for SEOmoz. As you may know, Google's site: search is not entirely reliable (not everything will show). However, it does give you an idea of the videos Google includes in the index, along with their thumbnails, titles, and descriptions, which can be incredibly helpful.
Video Index for Seomoz.org
If you have a video sitemap submitted inside of your Google webmaster tools account, you will be able to see the number of videos that you submitted along with the number of videos that have been indexed. Again, I have found the numbers to be less than comprehensive, but it is another tool to use in checking whether your videos have been indexed.
Getting your videos indexed by the search engines doesn’t have to be difficult. There have been some awesome blog posts here on SEOmoz about that, including this one titled An SEO's Guide to Video Hosting and Embedding and this one called Video Sitemap Guide for Vimeo and YouTube. There are also some excellent resources directly from Google, such as this section of their Webmaster help on Video Best Practices. Here is a screenshot of that page from their site:
Google Video Best Practices
Check these resources out when you have time; they are definitely worth the read. If you are struggling to get your videos indexed, you might want to consider using a hosting provider with built-in SEO features such as Wistia (they will take care of most of the heavy lifting for you). In the meantime, here is a quick overview of what you need to do to get videos into the index.

  1. Create and submit a video sitemap: Make sure that your sitemap includes a unique title, description, embed location, thumbnail, and content location for each video on your site. The keyword phrase and description should match the content on the page where you have embedded your video.

  2. Embed your videos using a simple SEO friendly embed code: The embed code that you use on your page needs to be SEO friendly. Google needs to be able to verify the information from your sitemap entries to ensure that the video is actually embedded on the page and that the information is accurate. Most SEO friendly embed codes will include all or most of the information from the sitemap. However, several hosting providers are also starting to integrate schema.org info into embeds to make information even more visible to the search engines.

  3. Get your page found: Standard SEO principles also apply in video SEO. Googlebot needs to be able to find the page where you have embedded the video, and in order to get that page to rank, you will need to make sure that it has PageRank passed to it through internal linking, external linking, or both.
If you are embedding your videos on pages that are already indexed or on a domain that is regularly crawled by Googlebot, it shouldn’t take long for you to see new videos show up in the index (1-3 days for our site).
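Putting step 1 into concrete terms, a minimal video sitemap entry following Google's sitemap-video schema might look like this (all URLs and text below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/wire-binding-machine-demo.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/demo.jpg</video:thumbnail_loc>
      <video:title>Wire Binding Machine Demo</video:title>
      <video:description>A walk-through of loading and binding a document.</video:description>
      <video:content_loc>http://www.example.com/videos/demo.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

The title and description here should match the page where the video is embedded, as noted above.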

2. Competition

Once your videos are included in the index and eligible for video thumbnails, the next major factor to consider is competition. Winning the video thumbnail result is highly dependent on how competitive the search term is for which you are trying to rank. If you want to beat out the competition, here are a few things to consider:
Are you competing with your Video Hosting Provider for the thumbnail result? If you are embedding videos from YouTube, Vimeo, Metacafe, or other public video sharing sites onto your site, you are fighting an uphill battle to win the video thumbnail. Until about a year ago, it was difficult to get Google to index these videos on your site. Now they will index them (in the case of YouTube you don’t really even need to submit a video sitemap). However, given the choice between your site and YouTube, Google seems to choose YouTube 9 times out of 10. The same is true, to a lesser degree, of sites such as Vimeo or Metacafe. For this reason, you are really better off hosting your videos with a hosting solution such as Wistia, Vzaar, Brightcove, Limelight, or using a custom player such as JW Player. Phil Nottingham provided a great overview of the features of these different options in the blog post I linked to above. Here is a screenshot:
Paid Hosting Package Analysis
If you have your video hosting situation figured out, the next thing you need to work out is what type of competition you have on YouTube. Currently, the vast majority of video thumbnail results seem to be awarded to YouTube videos. It probably has something to do with the fact that YouTube is the largest repository of video content on the web. However, it doesn’t hurt that it is owned by Google. Topics that have large amounts of high-quality video content on YouTube will be very difficult to crack using your own website. Keep this in mind when choosing key phrases and creating content.
Are you competing with yourself? If you create a lot of video, the temptation is very strong to distribute it everywhere (YouTube, Vimeo, Metacafe, etc.). This is a valid strategy for some companies. However, it is important to note that you will most likely be competing with your own content on these platforms for the video thumbnails. At MyBinding we have a huge YouTube channel, and we run into this problem all the time. Our YouTube videos outrank the videos on our own site. Ultimately you need to decide whether that is worthwhile for you or if you want to try and attempt to rank on different platforms for different keywords. This decision is going to be based on your business case and is going to vary from company to company.
Finally, remember that you are competing for a space on the first page of the Google search rankings. If the page where you embed your video doesn’t deserve to rank on the first page of Google for your chosen keyword, then winning the thumbnail will be difficult. That isn’t to say that video results don’t jump up the rankings occasionally. However, it is far easier to try and get a thumbnail added to an awesome page that is already ranking than it is to get a weaker page to skyrocket in the rankings.

3. Keyword Intent

It is difficult to define exactly which key phrases will qualify for video rich snippets in Google universal search results. It appears that virtually any key phrase could be awarded a video thumbnail. However, certain phrases are far more likely to have video results than others. The best way to think about this is to consider keyword intent. Search terms that include words such as demo, demonstration, review, tutorial, video, test, lesson, or how-to commonly return video search results. Google has determined that these words represent “intent” by the searcher that fits with video results. These types of search terms tend to be the easiest to dominate with video thumbnails.
Below is a search results page for the term “Wire Binding Machine Demo” (something from my industry and probably not exciting to most of you). You will notice that the first three results are all videos (one from Metacafe and two from YouTube), and four of the remaining results on the page are video-related.
Wire Binding Machine Demo SERP
Recently, I have also noticed that more and more specific product names are also returning video results in universal search. I suspect that moving forward, Google will continue to expand the search results that receive weighting for video thumbnail results. That being said, it is always a good idea to stick with the terms that are more likely to produce video results.
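Intent-word filtering like this is easy to automate when you are sifting through a large keyword list. Here is a minimal sketch in Python, using the modifier words mentioned above as an illustrative list (it is not an official Google list, and the sample keywords are hypothetical):

```python
# Flag key phrases whose wording suggests video intent, using the
# modifier words from the article as an illustrative (not official) list.
VIDEO_INTENT_WORDS = {
    "demo", "demonstration", "review", "tutorial",
    "video", "test", "lesson", "how",
}

def has_video_intent(phrase: str) -> bool:
    """Return True if any word in the phrase signals video intent."""
    return any(word in VIDEO_INTENT_WORDS for word in phrase.lower().split())

keywords = [
    "wire binding machine demo",
    "wire binding machine price",
    "how to bind a book",
]
video_candidates = [kw for kw in keywords if has_video_intent(kw)]
print(video_candidates)  # → ['wire binding machine demo', 'how to bind a book']
```

A filter like this is only a first pass; the real test is searching the phrase and seeing whether Google is actually showing thumbnails for it.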

4. Page Placement

According to Google’s best practices for video, they are looking for you to “Create a great user experience on your video pages.” Specifically, they state that they are looking for sites to create a standalone landing page for each video. With this in mind, the page where you embed your video should only have one video on it and that video should only be embedded on one page on your site (again don’t compete with yourself). The page should also include descriptive text, title, captions, and other information to help make your video stand out. Google can’t watch your video (yet), so they will often rank it based on the other information surrounding it on the page. Adding other media elements such as images along with text will not only provide a better user experience, but will also help you to rank your videos better.
Here is the exact wording from Google on this issue:
Great Video Experience Google Guidelines
In addition to the content on the page along with your video, Google has also stated that they are looking for a “prominently placed” embedded video player on the page. As we know from the Page Layout algorithm update in January 2012, Google is able to understand the placement of various elements on the page and use that information as a ranking factor. If at all possible, look to place your videos towards the top of your blog posts or pages.
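One way to supply the descriptive text, title, and captions Google is looking for in machine-readable form is structured data using the schema.org VideoObject vocabulary. The property names below (name, description, thumbnailUrl, uploadDate, embedUrl) are real schema.org properties, but the URLs and values are hypothetical placeholders; this is a sketch of the idea, not a definitive markup recipe:

```python
import json

# Hypothetical values for a video landing page; the schema.org
# VideoObject property names are real, the URLs are placeholders.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Wire Binding Machine Demo",
    "description": "A walkthrough of loading, punching, and closing wire binding spines.",
    "thumbnailUrl": "https://www.example.com/thumbs/wire-demo.jpg",
    "uploadDate": "2012-08-01",
    "embedUrl": "https://www.example.com/player/wire-demo",
}

# Emit the markup as a JSON-LD script block for the landing page.
print('<script type="application/ld+json">')
print(json.dumps(video_markup, indent=2))
print("</script>")
```

The same information should still appear as visible on-page text; the markup reinforces it rather than replaces it.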

5. High Quality Relevant Thumbnails

The final element to consider when trying to win the video thumbnail in universal search is the thumbnail itself. You have the opportunity to define the thumbnail for your video in the video sitemap that you provide to Google. When it comes to choosing a thumbnail, there are a few things to keep in mind:

  1. It should be high quality. Google’s guidelines call for at least 160x90 pixels and up to 1920x1080. I suggest going with a 16:9 aspect ratio.

  2. It should be representative of your content. Google is looking for thumbnails that reflect the content of your video. If your thumbnail is generic or unrelated to your video, it is possible that you may have problems keeping your videos in the index.

  3. It should be unique. Using the exact same image for multiple thumbnails is similar to trying to include the same video twice in the same video sitemap and can cause indexing problems. Make each thumbnail unique and save yourself the hassle.

  4. Choose your thumbnail with CTR in mind. This is your best chance to help yourself and get users to click on your content. Make sure that your thumbnail is compelling enough that users will want to click on it.
These are five of the most important factors that I have noticed in attempting to win the video thumbnail in Google. Have you seen others? Are you having success in these areas? Leave a comment and let's help each other get better in the area of video SEO.

Wiredwizrd

Morgan Todd, Lewistown, PA

Experienced Information Technology Manager with a strong knowledge of technical guidance, IT best practices, security protocols, team leadership, and analyzing business requirements.