Thursday, November 29, 2007

BBC News Exposes: Hijacked Web Search Results

A recent BBC News report revealed that hackers have hijacked web search results. A huge operation that poisoned web searches and tricked people into visiting malicious websites has been shut down.

For detailed information, visit the link below:
http://news.bbc.co.uk/2/hi/technology/7118452.stm

Wednesday, November 28, 2007

Ever Wondered Why To Implement A Robots.txt File

Have you ever wondered how to check whether a robots.txt file is implemented on your website? It's very simple: just open your web browser and enter www.yourdomain.com/robots.txt to view the contents of your robots.txt file.

You must be wondering why to implement a robots.txt file on your website. The answer is simple: if your website doesn't have a robots.txt file, search engines will automatically index everything they can find on your site, whereas a robots.txt file lets you tell them which parts to leave alone.

There are two important official commands for the robots.txt file: User-agent and Disallow. Do not use any commands other than these.

  • Don't change the order of the commands. Start with the user-agent line and then add the disallow commands:

User-agent: *
Disallow: /cgi-bin/

  • Don't use more than one directory in a Disallow line. "Disallow: /support /cgi-bin/ /images/" does not work. Use a separate Disallow line for every directory:

User-agent: *
Disallow: /support
Disallow: /cgi-bin/
Disallow: /images/
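
For completeness: leaving the Disallow value empty allows everything, so the minimal file below (a sketch assuming you want every crawler to index the whole site) is equivalent to having no robots.txt file at all:

User-agent: *
Disallow:

Conversely, "Disallow: /" blocks compliant crawlers from the entire site, so double-check the value before uploading the file.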

Monday, November 26, 2007

Webinars Have Many Advantages: From Flexibility To Cost-Effectiveness

Webinar is short for Web-based seminar. You could say a webinar is basically a conference-room seminar held online: the participants view the presentation through their browsers and listen to the speech or lecture through their telephone lines.

Webinars are now becoming the new choice in web conferencing, as they are easily accessible to anyone with an internet connection, a computer and a telephone line.

What are webinars used for?
  • Training a large number of people
  • Conducting large-scale meetings
  • Building a brand and generating sales leads
  • Making corporate announcements to focused groups
  • Holding press conferences

Thursday, November 22, 2007

Blessing of Web 2.0: Online Social Networks

Online social networks are one of the new trends of Web 2.0. People surfing online use them to stay in touch with friends and colleagues, meet new people, make work-related connections and much more. A business can make money using a social networking site by advertising on it.

OpenSocial by Google is helping people create Web applications that can be used on virtually any social networking site. Other sites are helping Google develop this tool. OpenSocial provides a common set of APIs for social applications across multiple websites. With standard JavaScript and HTML, developers can create apps that access a social network's friends and update feeds.
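
As a rough illustration of the idea, an OpenSocial application is packaged as a gadget: an XML file whose HTML and JavaScript payload calls the OpenSocial JavaScript API. The sketch below follows the early (0.7-era) API, so treat the exact method names as assumptions that may differ between versions:

<?xml version="1.0" encoding="UTF-8"?>
<Module>
  <ModulePrefs title="Hello OpenSocial" />
  <Content type="html"><![CDATA[
    <div id="message"></div>
    <script type="text/javascript">
      // Ask the container site for the person viewing the gadget.
      function loadViewer() {
        var req = opensocial.newDataRequest();
        req.add(req.newFetchPersonRequest(
            opensocial.DataRequest.PersonId.VIEWER), 'viewer');
        req.send(onLoadViewer);
      }
      // Greet the viewer once the container answers.
      function onLoadViewer(response) {
        var viewer = response.get('viewer').getData();
        document.getElementById('message').innerHTML =
            'Hello, ' + viewer.getDisplayName();
      }
      gadgets.util.registerOnLoadHandler(loadViewer);
    </script>
  ]]></Content>
</Module>

The same gadget can then run on any container site that implements the OpenSocial APIs, which is exactly the portability the specification promises.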

Mistakes That Affect Your Search Engine Rankings

Optimizing a website takes a lot of time and effort, and it involves different strategies for getting higher search engine rankings. Sometimes, though, a website fails to get the desired results, or fails to rank on the top search engines, simply because of a few mistakes.


Jotted below are a few common mistakes that have a bad effect on your search engine rankings.

  1. Keyword Stuffing: Using the same keyword again and again, or repeating the same keywords in your keyword meta tag, is known as keyword stuffing and is considered spam by search engines. You must avoid it, as it may harm your search engine rankings.
  2. 301 Redirects: A redirect is used to re-route users from one URL to another. From a search engine's point of view, use only a ‘301 redirect’, which is the safest method (see the sketch after this list).
  3. Duplicate Content: Always have unique, informative content for users on all web pages, and keep it related to your business.
  4. Cloaking: Cloaking is a spam technique used by some to display different pages to the search engine spiders than the ones normal visitors see. You should always avoid any form of cloaking, as it is strongly forbidden by most major search engines.
  5. Navigation and internal linking: Proper navigation and internal linking also matter a lot. The navigation menu should be easily accessible to users. Make sure that the anchor text linking to pages within your own website is relevant to the target page.
  6. Over-Optimization: Over-optimization should not be done, as it suggests that your site has been designed for search engines and not for users. It may drop your search engine rankings, since search engines are now able to detect over-optimized sites, so you must avoid it.
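
As a minimal sketch of point 2: on an Apache server, a 301 (permanent) redirect is commonly set up through an .htaccess file. The file names and domain below are hypothetical placeholders:

Redirect 301 /old-page.html http://www.example.com/new-page.html

The 301 status tells both browsers and search engine spiders that the page has moved permanently, so crawlers update their index to point at the new URL instead of treating the two pages as duplicates.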

Saturday, November 17, 2007

Google Purchased DoubleClick

Google paid $3.1 billion for DoubleClick, which also owns a very large search engine optimization company called Performics. Now that Google has acquired the Internet advertising giant DoubleClick, what does it plan to do with its search marketing company, Performics?

Thursday, November 15, 2007

How To Combine SEO With Ajax-Based Websites

AJAX is a technology based on JavaScript that can fetch information seamlessly in the background onto an already loaded web page. The first thing to remember before you start building your website is to create your site's structure and navigation using only HTML; Googlebot will be happy with the HTML. Then, once you are done with the website's design, links and content, you can spice up its look and feel with Ajax, since visitors with modern browsers can enjoy the Ajax features.

Design your website for Visitors and Search Crawlers too

One should always create a website for visitors, not just for search engines. When you're designing your Ajax website, think about visitors who may not be using a JavaScript-capable browser.
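
A minimal sketch of this idea: a plain HTML link that crawlers and script-less browsers follow normally, which JavaScript-capable browsers upgrade to an Ajax request. All ids and URLs here are hypothetical placeholders, and very old browsers may need an ActiveX fallback for XMLHttpRequest:

<a href="/products.html" id="products-link">Products</a>
<div id="content"></div>

<script type="text/javascript">
// Progressive enhancement: this only runs in JavaScript-capable browsers.
document.getElementById('products-link').onclick = function () {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById('content').innerHTML = xhr.responseText;
    }
  };
  xhr.open('GET', '/products.html', true);
  xhr.send(null);
  return false; // cancel the normal page load; Ajax takes over
};
</script>

Because the href points at a real page, the site still works (and still gets crawled) even when the script never runs.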


To read more about how to combine Ajax with search-engine-friendly design, click the link below. Michael Wyszomierski and Greg Grothaus of Google have jotted down a very interesting article on the use of Ajax and search engine friendliness.


http://googlewebmastercentral.blogspot.com/2007/11/spiders-view-of-web-20.html

Monday, November 12, 2007

How To Optimize A Website Using JavaScript

Search engine optimization of a website for better search engine positioning is the most cost-efficient way of marketing a web site. Every search engine's crawler uses different algorithms to rank its results. As you prepare to launch your newly designed website, you should keep in mind two important points:

  1. Make it easy for the search engine crawler to index your website
  2. Make sure your content is optimized for top rankings

Below are a few points that describe how to optimize a website that uses JavaScript; a combined sketch follows the list.

  1. Adding a set of links between a pair of noscript tags enables the search engine crawler to reach the links at the top of the page. Thus, adding a list of the main navigation links in plain HTML between noscript tags can be a useful way of helping the web crawler access the most important pages of the site.
  2. Place your JavaScript at the end of your HTML file: Having script tags in your header can also be a great pain for your website's ranking, so the easy way out is to move the scripts to the bottom of the page.
  3. Put breadcrumb links and bottom navigation links in place, so that even if the search engine spiders are unable to follow the links built into the JavaScript code, the crawlers can still navigate throughout the rest of the site by following the secondary navigation.
  4. Load JavaScript on demand: You can load an arbitrary JavaScript file from any domain only when it is needed, for example by dynamically appending a script element to the page.
  5. Compress your JavaScript: First, try to make the JavaScript file itself smaller. There are lots of tools that “crunch” your files by removing whitespace and comments (JSMin and Packer are two well-known examples). Online tools are available for this, but they can be finicky and may make unwanted changes if your code isn't formatted properly. Debugging compressed JavaScript can also be really difficult, because the variables are renamed. So create a “debug” version of your page that references the original files; once it is tested and the page works, pack it, test the packed version, and then deploy.
  6. Create library files: It can also be helpful to combine smaller files into a larger one, especially if they don't change often. This reduces the number of requests the browser makes.
  7. There should be regular HTML links throughout the body of your page, in addition to bottom (or footer), side/top and breadcrumb navigation.
  8. Having a sitemap can be another good way of providing access to the most important links of the website, so that a web crawler can follow them and index the whole site.
  9. Select several two- or three-word keyword phrases and place them within the page title, the meta tags and, most importantly, throughout the body of the page.
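
Pulling points 1, 2, 4 and 9 together, here is a minimal page skeleton. All file names, ids and keyword phrases are hypothetical placeholders, a sketch rather than a definitive implementation:

<html>
<head>
  <!-- Point 9: keyword phrases in the title and meta tags -->
  <title>Affordable Web Design Services</title>
  <meta name="description" content="Affordable web design services for small businesses.">
  <meta name="keywords" content="affordable web design, web design services">
</head>
<body>
  <!-- Point 1: plain HTML navigation links between noscript tags,
       as a fallback for the script-driven menu loaded further down -->
  <div id="menu"></div>
  <noscript>
    <a href="/index.html">Home</a>
    <a href="/services.html">Services</a>
    <a href="/contact.html">Contact</a>
  </noscript>

  <p>Body copy with the chosen keyword phrases worked in naturally...</p>

  <!-- Point 2: scripts moved to the bottom of the page -->
  <script type="text/javascript" src="menu.js"></script>
  <script type="text/javascript">
    // Point 4: load a JavaScript file on demand by appending a
    // script element, instead of including it in every page load.
    function loadScript(url) {
      var s = document.createElement('script');
      s.type = 'text/javascript';
      s.src = url;
      document.getElementsByTagName('head')[0].appendChild(s);
    }
    // Example: only fetch the gallery code when it is actually needed.
    // loadScript('gallery.js');
  </script>
</body>
</html>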

Wednesday, November 7, 2007

Search Cloud - An Alternative Way Of Searching

Search clouds are coming up well in the online internet marketing industry. If we analyse this process in more detail, we'll see that a search cloud is an alternative way of searching through a database efficiently. Each search in the database gets associated with a keyword.

To sum up what exactly a search cloud is: it's just a graphical representation of your search, a cloud of keywords and small icons with your keyword in the middle and related terms hanging around it. We can also add these search clouds to blogs; the ones discussed here are provided by Quintura.

To get a better understanding of this concept, have a look at the link below, which has a pictorial representation of the technique called “Search Cloud”:

http://www.pandia.com/sew/552-embed-quintura.html#more-552

Friday, November 2, 2007

Truth About The Google AdWords Negative Keyword Search Tool

There are many discussions going around claiming that the negative keyword search tool has been removed from Google AdWords.

I have found a very interesting point on this query. Actually, the negative keyword tool still exists: it has been reassigned to individual keywords, or, to put it the other way around, it has been relocated.

For the full details, you can read an article at the link below, complete with snapshots.

http://www.seroundtable.com/archives/015174.html

Thursday, November 1, 2007

Blogs - A Cost-Effective And Simple Way To Increase Traffic

A blog is basically an online journal related to a particular topic, such as food, politics or the environment. Blogs consist of text, images and links to other blogs and websites related to our own. Posts in a blog appear in chronological order. A blog is itself a website that lets users share their own thoughts and suggestions.

Benefits of blogging
  • Blogs contain categorized content and are mainly rich in content, as various readers can post their thoughts and also update those posts. Crawlers mainly admire content-rich sites.
  • As a blog is related to a particular topic, it is beneficial for web surfers, because they will get the information they need. So the traffic rate on blogs is higher than that of normal websites.
  • Blogs contain fresh content, because blog content is readily updated, and both readers and search spiders reward fresh content.
  • Blogs can link to each other more easily than websites can. Blogs support text, images and videos, along with more detailed information and opinions about a particular topic, so they attract more incoming links and traffic.

So it can be summarized that blogs are very important from both a business and an SEO point of view, as they are a cost-effective and simple way to increase traffic and incoming links to our website.