Showing posts with label Google.

Monday, 1 June 2009

New SEO Friendly Search Engine Captures Niche Market

Search Engine SEOENG(R) and other small tech startups break new ground in Search, carving out the visionary niche markets of the future, while the Search Giants enhance their industry-dominating Search solutions by increasing Search 'intelligence'.

Bradenton, FL (PRWEB) - A new storm is brewing in Search 2009, and Search Engine SEOENG(R) is rapidly gaining traction in a highly saturated industry with its unique SEO Friendly Search Engine. This technology startup has developed a Search Engine with a transparent and navigable interface where users can, for the first time, see precisely how a Search Engine analyzes and ranks a Website. SEOENG(R) is taking Search in a new direction, by opening its arms to the very industry that Search Giants have fought so hard to keep at arm's length. A new market for Search, directed at Website Owners, Web Designers, and SEO Experts is proving to be a prosperous and exciting venture for this tech company.

Competition is alive and well in the Search Industry, as top Search Engine Google(TM) recently announced its new Google Labs project, which according to the Official Google Blog, is called 'Google Squared', and is Google's attempt at delivering a more structured result to a user's question. Search Giant Yahoo!(R) is also talking about its new vision of Search, which is called 'WOO' (Web of Objects). WOO is Yahoo's take on the next generation of Search, which some believe involves a graph of 'things', not just Webpages. And finally, Microsoft(R) is rumored to be releasing 'Bing(TM)' (formerly Kumo), which according to the Website is a new version of a Search Engine called a 'Decision Engine'.

After more than a decade of rule by Search Giants, SEOENG(R) and other small startups have staked new land in the World of Search. According to Search Engine SEOENG(R), "We are entering a historic time that is very different from the past ten years, because no longer is the competition about delivering the best '10 blue links'. It's all about the 'Next Generation of Search' and the many niche markets that Search is now exposing due to the rapid decrease in costs of infrastructure."

The largest of these niche markets appears to be Semantic Search. Semantics is the study of meaning, and in Search this means not only delivering relevant results, but attempting to infer the 'reason' behind the user's question. Because computer hardware is much less expensive these days, Semantic Search and other new niche markets do not require large investments in infrastructure upfront, and some smaller Search Engine startups, including SEOENG(R), are entering the race and reinstating that spark of creativity and competition that some might say has been lacking for some time.

In addition to SEOENG(R), Wolfram Alpha(TM), is a new Search Engine startup whose Website promises '...an ambitious, long-term project to make all systematic knowledge immediately computable by anyone.' Wolfram Alpha, created by Stephen Wolfram (the inventor of the famous technical and scientific tool suite Mathematica(R)), may have found its niche as the next premiere research tool for scientists and engineers. Other niche markets in Search also exist, and companies are quickly starting to take claim.

SEOENG(R) Co-Founder and President, Marketing & Operations, Maura Stouffer, recently stated that "One only has to look at our tiny startup SEOENG(R), an acronym for 'Search Engine Optimization Engine(R)', to see the potential available in new Search technologies. We have created what none of the Search Giants have even considered - an honest attempt at being 'friendly' to Webmasters and SEOs. Sure, there were Google's Webmaster Tools and Yahoo's Site Explorer, but did Google and Yahoo! really let users into their Search Engines to see how things were working? How things could be improved?" SEOENG(R) is the creation of talented individuals from Carnegie Mellon and Cornell University, and promises to be the first SEO Friendly Search Engine for Webmasters, SEOs, and basically anyone who owns a Website: a Search Engine that shows the World how it ranks a Website by disclosing where the internal penalties and deficiencies lie, and how they can be fixed to improve placement on Search Engine Results Pages (SERPs). As more and more innovation takes place, new ideas will bring new direction to Search, and that can only mean one thing: in Search, good times are ahead for both businesses and end-users!







Friday, 24 April 2009

Google patches "severe" Chrome bug : PC Pro


Google has patched a bug in its Chrome browser that allowed attackers to perform cross-site scripting attacks.

The flaw was discovered earlier this month by an IBM security researcher and was patched last night, with the release of Chrome version 1.0.154.59.


"An error in handling URLs with a chromehtml: protocol could allow an attacker to run scripts of his choosing on any page or enumerate files on the local disk under certain conditions," Chrome program manager Mark Larson explains.

"If a user has Google Chrome installed, visiting an attacker-controlled web page in Internet Explorer could have caused Google Chrome to launch, open multiple tabs, and load scripts that run after navigating to a URL of the attacker's choice. Such an attack only works if Chrome is not already running."

Read more



Tuesday, 17 March 2009

Thief Uses Google Earth To Nick Lead Roofs

In an incident which further sparked concerns over the misuse of the Google Earth application for felonious purposes, a thief stole lead worth a whopping £100,000 from the roofs of listed buildings, using the application to get a detailed view of them.



The thief, Berge, used the application to identify lead roofs by their darker colour, and made £44,500 by selling around 44.6 tonnes of lead during a six-month spree that ran from September to February.

The roofs of Sutton High School for Girls, Croydon Parish Church, and the Honeywood Museum in Carshalton were all looted. Berge was sentenced to eight months in jail, suspended for two years, after he admitted involvement in as many as 30 offences.

The court was told that Berge, along with his accomplices, meticulously planned the thefts, arriving on site equipped with ladders and abseiling ropes to strip the roofs, and even stealing a car to make a speedy escape.

Read more...


Using SEO to Manage Your Online Reputation

Is there a page about your company on the web that you don't like? The best way to "take care" of undesirable material with a high organic ranking is through search engine optimization, writes SEOmoz.


Google has posted advice on how to remove unwanted pages, but its strategy is mostly to contact the person who put up the unwanted webpage and try to convince them to modify or remove it, with the option of taking the request to court if the page contains something illegal.

But there is a third method, missing from Google's advice, that involves using SEO to help with reputation management. Unfortunately, it is neither easy nor cheap.

The three main components of an SEO-fueled reputation management campaign (a rough sketch of the first step follows the list):

1. Identify which keywords produce prominently listed and undesirable results.

2. Create content on multiple sites that will outrank the negative content, keeping in mind that Google generally only lists a maximum of two pages from a single domain on a given results page.

3. Optimize those pages with content & links to achieve rankings higher than the negative content, thus "pushing it down" to the 2nd page of results (or further).

When creating content with the aim of outranking a negative result, leverage as many positive "pre-existing" conditions as possible.
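As a rough illustration of the first step, the sketch below assumes you have already exported the top results for each of your keywords from whatever rank-tracking tool you use; the keyword data, the negative URL, and the flag_negative_rankings helper are all made-up examples rather than anything prescribed by SEOmoz or Google.

# Hypothetical sketch: flag keywords where a known negative URL ranks prominently.
# The data below stands in for an export from a rank-tracking tool.

serp_results = {
    # keyword -> ordered list of result URLs (position 1 first)
    "acme widgets": [
        "http://www.acme-widgets.com/",
        "http://complaints.example.com/acme-widgets-review",
        "http://en.wikipedia.org/wiki/Acme_Corporation",
    ],
    "acme widgets reviews": [
        "http://complaints.example.com/acme-widgets-review",
        "http://www.acme-widgets.com/testimonials",
    ],
}

# URLs you would like pushed down the results
negative_urls = {"http://complaints.example.com/acme-widgets-review"}

def flag_negative_rankings(results, negatives, top_n=10):
    """Return (keyword, position, url) for every negative URL found in the top N."""
    flagged = []
    for keyword, urls in results.items():
        for position, url in enumerate(urls[:top_n], start=1):
            if url in negatives:
                flagged.append((keyword, position, url))
    return flagged

for keyword, position, url in flag_negative_rankings(serp_results, negative_urls):
    print("'%s': negative result at position %d -> %s" % (keyword, position, url))

Keywords flagged this way are the ones worth building and optimizing new content around.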



Read more...


Tuesday, 24 February 2009

Ask.com Joins Canonical Trio

It looks like the precedent has been set, and Ask.com has opted to follow Google, Yahoo, and MSN in their quest to make the issues raised by content duplication a thing of the past.

I recently wrote a post showing how website owners can take advantage of the new 'Canonical Links' to extract maximum value out of duplicated content.


Yesterday Ask.com announced their intention to join the big three through their official blog, in which Yufan Hu of Ask.com says: "The 'canonical' feature represents a timely, relevant, and positive partnership between major search engines. It is a step to ensuring more consistency with regard to treatment of duplicates among all of the engines. It will also put more control into the hands of site designers over how their sites are represented within the search indexes."

You can read the full announcement in their blog post entitled 'Ask is Going Canonical'.


Friday, 20 February 2009

Canonical Links and Content Duplication

For a long time now, search engines have been implementing stringent measures to stop website owners from manipulating rankings through content duplication. The way the Internet has evolved over recent years, however, has meant that some sites have been running the risk of being unfairly penalised for content duplication caused more by the technology they are using than by their intent.

In an effort to address this, the big three (Google, Yahoo, MSN) decided last week to support a common standard which aims to eradicate duplicate URLs within a site.




The standard being introduced is very simple, and consists of no more than nominating a single URL (or "canonical" location, if you're a search engine) and declaring it on all other URLs where the same content is replicated.

Take the following URLs as an example:

http://www.my-company.com/my-products/product-details.aspx?id=0101&source=google
http://www.my-company.com/my-products/product-details.aspx?id=0101&source=yahoo
http://www.my-company.com/my-products/sun-glasses.aspx

These three URLs will display exactly the same content even though the page has three very distinct URL locations based on the origin of the visit and its optimisation.

The agreed standard requires no more than the following syntax being implemented within the <HEAD> section of the two dynamic URLs.

<link href="http://www.my-company.com/my-products/sun-glasses.aspx" rel="canonical"/>

Doing this will make search engines aware of the preferred URL and pass any PageRank and link equity the dynamic URLs may have gained across to the chosen URL.
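In the example above the preferred URL is a separate, friendly address, so the mapping has to be defined by the site itself. Where the duplicates differ only by tracking parameters, though, the tag can be generated automatically by stripping those parameters before the page is rendered. A minimal Python sketch of that idea (the TRACKING_PARAMS list and the canonical_link_tag helper are illustrative assumptions, not part of the published standard):

# Illustrative sketch: derive a canonical URL by stripping tracking parameters,
# then render the tag for the page's <HEAD> section.
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

TRACKING_PARAMS = {"source", "utm_source", "utm_medium", "utm_campaign"}  # assumed list

def canonical_link_tag(url):
    parts = urlparse(url)
    # keep only the query parameters that actually change the content served
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    canonical = urlunparse(parts._replace(query=urlencode(query)))
    return '<link href="%s" rel="canonical"/>' % canonical

print(canonical_link_tag("http://www.my-company.com/my-products/product-details.aspx?id=0101&source=google"))
# prints: <link href="http://www.my-company.com/my-products/product-details.aspx?id=0101" rel="canonical"/>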

Google guru Matt Cutts explains Canonical Linking in an interview for WebProNews. You can also find out more directly from Google, Yahoo, and MSN.


Thursday, 2 October 2008

Is Google Chrome In Need of a Polish?

I don't know about you, but after becoming used to Google's highly useful and free tools, I've been left feeling somewhat disappointed with Google Chrome, their new browser.



I've been an advocate for all things Google since the brand's infancy, and right up to now I have always been wowed by their tools, which have always found their way into my shortcuts.

Google Chrome, however, has left me wondering what all the fuss has been about. Most of the functionality we've all become used to in a basic browser seems to have been missed by its developers, and there isn't any additional ground-breaking functionality to make up for the loss.

After eagerly awaiting what could have been one of their biggest products to date, I've ended up going back to my original default browser, at least until the guys have had a chance to give Google Chrome a good polish.


Tuesday, 19 August 2008

Prosperous Online Prospects

Understanding your market is arguably the most important aspect behind the success of your site, and understanding the impact this could have on your search engine strategies is equally as important.

But you're probably thinking that this level of information is likely to cost a small fortune and that you will only be able to obtain this by involving a specialised agency.

Well, think again...




Google are once again ahead of the game and are currently testing AdPlanner, a tool which will turn the market on its head when it's finally released.

Even though this tool is primarily aimed at the advertising world, the ability to identify the nature of your audience and then be given a list of other sites catering for the same group of people opens up a world of partnership opportunities. Just think of all those highly relevant inbound links and you'll start to see the value this could add to your SEO campaigns.

So, will it cost you an arm and a leg to do your market research in the future? Not if Google can help it!

This tool is currently only available to a select few (OK, so I'm boasting a bit!), but there's plenty of information about the ins and outs of AdPlanner directly from the Google guys... Enjoy!


Wednesday, 25 June 2008

Bridging the Analytics Gap

Google Analytics has recently introduced a handy little "benchmarking" feature which lets you compare your site against your industry's overall performance.



So you decide to launch a site. You are confident that over the next 6 months you'll increase traffic by 50%, and then double it within a year. But is that good enough? How can you really tell? What if everyone else is achieving 4 or 5 times that? Will you survive in a market where your competitors are going at 100mph?

This information is now readily available as part of the Google Analytics service, and for free! The enterprise-level analytics tool I use (the cost of which would probably be enough to solve the financial crisis of a small country, by the way) does not yet offer this level of information, though I'm sure it will.

In true Google style, Analytics is now effortlessly taking full advantage of the gap between "site-centric" and "market-centric" solutions.


Tuesday, 24 June 2008

Google Ads through Yahoo!?

I was surprised to see Yahoo! getting into bed with Google and setting up a deal which sees the big G's ads cropping up within Yahoo's results.



Initially this is only set to run across the pond, but it makes me think this could be the beginning of the end for Yahoo!

Let's face it... when you get into a situation where you're ready to let your biggest competitor have a presence on your site, how far must you be from hanging up your gloves? Microsoft's U-turn on the original deal must have left Yahoo! even more vulnerable than initially thought.


The Only Way is Up...

Ok, so I'm showing my age a bit, but what can I say... I'm in a good mood and Yazz's classic hit seems more than appropriate for the occasion.



My site seems to be in Google's good books once again, and traffic is slowly (but very surely) heading in the right direction.

The main page has made it back onto the first page of results, and is steadily crawling its way up to the number one slot for the brand's name, where it rightfully belongs.

I can once again focus my attention on some of the activities which had, up to this point, understandably been relegated to the lower tiers.


Monday, 23 June 2008

Fragile Google Rankings

Over the past few weeks I've been experiencing a drop in traffic on one of my websites, and today I finally found the fix. It just goes to show how easily Google can turn their back on you if their bot is not being looked after.



So what happened? I hear you ask... Well, it all started with some changes which had been scheduled for release at the beginning of the month, meaning the site had to come down whilst maintenance work was being carried out.

As the laws of Murphy came into play, the site was paid a visit by Google's trusted Bot, and hey presto, the website was no more than a holding page in Google's eyes.

Confident that Google would be sending its most trusted agent back to the site in a matter of days, I decided to play the SEO waiting game once again (not uncommon in the SEO world).

A week went by and still no sign of getting indexed, which made me wonder if something had gone wrong with the release... Could the changes have caused the drop? Could the updated or deleted files have been so critical to Google that it simply decided not to come back at all?

Not having direct access to the website files, at this point I decided to have a chat with the project manager to air my concerns, and I left reassured that none of the files updated as part of that release would have had an impact on Google.

Another two weeks passed, and all I could do was scratch my head and hope that Google would soon come back and take pity on the traffic being sent through (the lowest volumes since records began). Not even updating the sitemap file did the trick.

So what was the problem? I hear you shout... Well, somehow we had ended up with a corrupt Robots.txt file which, even though it was available in the root of the site, was inaccessible to Google (and all other search engine bots).

Recreating and uploading this file has done the trick... the site is once again seen by Google as a friend rather than an enemy.
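If you want to rule out the same problem on your own site, a quick sanity check is to fetch and parse the file yourself. Here's a minimal sketch using Python's standard library (the domain below is just a placeholder):

# Minimal sanity check: confirm robots.txt is reachable and parseable,
# and that a crawler such as Googlebot would be allowed to fetch the homepage.
import urllib.request
import urllib.robotparser

SITE = "http://www.example.com"  # placeholder domain

# 1. Is the file actually being served?
response = urllib.request.urlopen(SITE + "/robots.txt")
print("HTTP status:", response.getcode())

# 2. Does it parse, and would Googlebot be allowed in?
parser = urllib.robotparser.RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()
print("Googlebot may fetch /:", parser.can_fetch("Googlebot", SITE + "/"))

If the first request fails, or the second check unexpectedly returns False, your Robots.txt file deserves a closer look.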

I'd be interested in hearing your stories.
