Tuesday, July 16, 2013

Latest SEO Interview Questions and Answers 2013 for Experienced & Freshers


Q1. Tell me something about yourself?
Ans. Answer confidently: briefly describe yourself, your strengths, and your career goals.

Q2. What do you know about SEO?
Ans. SEO (Search Engine Optimization) is the process of getting traffic from the free (organic) and paid listings on search engines such as:

Google
Yahoo
Bing


Q3. How do you get a top position in search engines?
Ans. A top position in search results is earned through two kinds of work:
A) ON PAGE OPTIMIZATION

B) OFF PAGE OPTIMIZATION

Q4. What do you know about On-Page Optimization?
Ans. On-page optimization is work done on the pages of your own site: the title tag, meta tags, site structure, site content, internal links, and keyword usage. It is largely technical work on the site itself.


Q5. What do you know about Off-Page Optimization?
Ans. Off-page optimization is work done outside your own site to build its authority: directory submission, link building, social media promotion, and publishing fresh content elsewhere. Unlike on-page work, it does not involve changing your site's own pages.

Q6. Do you know about the latest updates in SEO?
Ans. Yes. The most recent major Google algorithm updates are:
1) Panda
2) Penguin

Q7. What is the latest Panda update?
Ans. Panda is a Google update meant to improve the quality of search results. The latest Panda version (at the time of writing) is 2.5.3. The focus shifted to the user: quality content, proper design, page speed, proper use of images and videos, and the content-to-ad ratio all matter more after the Panda update.

Q8. What do you know about the latest update in Penguin?

Ans. Penguin is the code name of a Google algorithm update that first arrived on April 24, 2012. Its goal is to decrease the rankings of websites that violate Google's Webmaster Guidelines, typically through black-hat SEO techniques such as keyword stuffing and cloaking. When it first arrived it affected about 3.1% of search queries. On September 5, Google rolled out Penguin 3, which Matt Cutts described as a data refresh; it affected about 0.3% of English-language queries. Many sites with duplicate content were hit by this update, and it remains very dangerous for SEO: clean techniques such as directory submission can repair some of the damage, but overall it is bad news for the world of SEO.

Q9. Who is Matt Cutts?
Ans. Matt Cutts is the head of Google's web spam team.



Q10. What is keyword stemming?
Ans. Keyword stemming is the process of finding the root word of a search query.
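Stemming can be illustrated with a crude suffix-stripper. This is purely for demonstration (real engines use proper stemmers such as the Porter algorithm, and the suffix list here is an assumption, not how any search engine actually works):

```python
def crude_stem(word):
    """Strip a few common English suffixes to approximate the root word.
    A real search engine would use a proper stemmer (e.g. Porter's algorithm)."""
    for suffix in ("ing", "ers", "er", "ed", "s"):
        # Only strip when enough of the word remains to be a plausible root.
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# A stemming search engine can match "collecting", "collectors",
# and "collects" against the same root.
print(crude_stem("collecting"))  # collect
print(crude_stem("stamps"))      # stamp
```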

Q11. What is keyword density?
Ans. Keyword density is the percentage of your content made up of a given keyword; getting it right helps your content stand out. Here is the formula for keyword density:

        keyword density = (total number of keyword occurrences / total number of words in the article) × 100
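The formula above is easy to turn into a quick checker. A minimal sketch (simple whole-word matching; real tools also handle phrases and markup):

```python
def keyword_density(text, keyword):
    """Return keyword density as a percentage:
    (keyword occurrences / total words) * 100."""
    words = text.lower().split()
    total = len(words)
    if total == 0:
        return 0.0
    hits = words.count(keyword.lower())
    return hits / total * 100

article = "stamp collecting is fun and stamp values can rise"
print(keyword_density(article, "stamp"))  # 2 of 9 words -> about 22.2%
```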

Q12. What are Webmaster Tools?
Ans. Webmaster Tools is a free service from Google where you can check details about your site such as indexing data, daily users, stats, search queries, CTR, and your XML sitemap.
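The XML sitemap mentioned above is a plain XML file listing your site's URLs, following the sitemaps.org protocol. A minimal generator sketch (the example URLs are placeholders):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Placeholder URLs; submit the resulting file through Webmaster Tools.
print(build_sitemap(["http://www.example.com/", "http://www.example.com/about"]))
```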

Q13. Which important factors make On-Page Optimization better?
Ans. In on-page optimization the important factors include:
Title tag
Meta tags
Keywords
Sitemap
Images
Internal linking
Breadcrumbs

Q14. Which important factors make Off-Page Optimization better?
Ans. In off-page optimization the important factors include:

Directory submission
Article submission
Press releases
Blog writing/posting/creation
Classified submission
Social media


Q15. What is the Google sandbox?
Ans. The Google sandbox is a notional holding area where new and less authoritative sites are kept, ranking poorly, until they become established and trusted on the web.

Q16. What is LSI?
Ans. LSI stands for Latent Semantic Indexing. It is an information retrieval technique that identifies relationships between terms and concepts in content.

Q17. How many characters are allowed in the title tag?
Ans. The title tag should be kept to approximately 70 characters:
            <title>primary keyword (up to ~70 characters)</title>

Q18. How many types of meta tags are there, and what are their character limits?
Ans. Two meta tags matter most in SEO:
Description meta tag (about 150 characters)
Keywords meta tag (about 200 characters)
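The character limits above can be checked with a small script. A sketch, assuming the approximate limits from this answer (search engines change the exact cut-offs over time, so treat these numbers as guidelines):

```python
# Approximate limits from the answer above; search engines change these.
LIMITS = {"title": 70, "description": 150, "keywords": 200}

def check_lengths(tags):
    """Return a list of warnings for tags that exceed the usual limits."""
    warnings = []
    for name, text in tags.items():
        limit = LIMITS.get(name)
        if limit is not None and len(text) > limit:
            warnings.append(f"{name}: {len(text)} chars (limit ~{limit})")
    return warnings

tags = {
    "title": "Latest SEO Interview Questions and Answers",
    "description": "Common SEO interview questions with short answers. " * 4,
}
print(check_lengths(tags))  # the repeated description exceeds 150 chars
```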


Q19. What SEO tools do you use?
Ans. Google Webmaster Tools, Google Analytics, keyword research tools, Alexa, and Open Site Explorer.

Q20. How many SEO techniques do you know?
Ans. 1) Black-hat SEO
        2) White-hat SEO
        3) Gray-hat SEO

Q21. What is black-hat SEO? And which techniques are used in it?

Ans. Black-hat SEO is a set of techniques built on deception: duplicate content, copied photos and videos, hidden links, keyword stuffing, doorway pages, and more. It is bad for your site because when Google crawls it and finds these issues, it can penalize the site, for example by pushing it down 30 or more positions in the rankings.

                                           Black-hat techniques
Hidden links
Keyword stuffing
Doorway pages
Irrelevant keywords
Link farming
Mirror sites
Note: avoid these techniques; they are not good for SEO.

Q22. What is white-hat SEO? And which techniques are used in it?
Ans. White-hat SEO is a technique in which we use fresh, original content on the site and never any kind of duplicate content. It is the best way to earn high rankings in search engines.
                                         White-hat techniques
Quality content
Titles and meta data
Keyword research and effective keyword use
Quality inbound links
Note: the recommended approach for good SEO.

Q23. What are bookmarking sites?
Ans. Bookmarking sites help you get instant traffic to your site through their powerful social media reach. You can bookmark a page to your favorites list and return to it with one click whenever you need it.

Q24. Name the top 6 bookmarking sites.
Ans. Top bookmarking sites:
Twitter
Pinterest
Reddit
StumbleUpon
Digg
Delicious

Q25. Name one URL-shortening site.
Ans. http://www.bitly.com


Q: Define blog, article & press release?


Ans: A blog is information or discussion published on a website as distinct entries called posts. A blog is more individual than an article or press release: it is personal in both style and content, written much the way you would talk to your readers. It is also called a web diary or online diary.
An article is concerned with a specific topic or event and leans toward opinion rather than plain information; it presents opinions, views, and ideas, and is generally written by a third party or an expert in a specific field.

A press release relates to a specific action or event and can be republished by various mass media, including other websites. It should be simple, short, and professional, and convey a clear message or piece of information.

Q26: What are Meta Tags?


Ans: HTML meta tags are tags of page data that sit between the opening and closing head tags of a document's HTML code. They are hidden keywords that live in the code: invisible to visitors, but visible to and readable by search engines.
Example:
<head>
<title>Not a meta tag, but required anyway</title>
<meta name="description" content="Write your description here" />
<meta name="keywords" content="Write your keyword here" />
</head>


Q27: Difference between keyword & keyword phrase?


Ans: A keyword is a single-word search term, while a keyword phrase is a combination of two or more words. It is very hard to rank highly for a one-word keyword unless it has little online competition, so targeting single words is discouraged. To drive more traffic and reach the top of the SERPs, use keyword phrases.


Q28: Establish a difference between PR & SERP.


Ans: PR (PageRank) is Google's measure of a page's importance, determined by the quality of inbound links from other websites and web pages.
SERP stands for Search Engine Results Page: the page of results a search engine returns for a query, and hence the placement of your website or web page on it.


Q29: Define Alt tag?

Ans: The alt attribute, often called the alt tag, is used in HTML and XHTML documents to define alternative text that is rendered when the element it applies to cannot be rendered. One useful feature of alt text is that it can be read aloud by a screen reader, software through which a blind person can hear the page. It also provides alternative information for an image when a user cannot view it, for example on a slow connection or when there is an error in the src attribute.
For example, the HTML for an image will look something like this:
<img alt="You can define the alt text just below the image title input box while uploading or editing an image." src="http://www.webgranth.com/wp-content/uploads/2012/07/Alt tag.jpg" />


Q30: What do you know about Adsense?


Ans: AdSense is a web program run by Google that lets publishers of content websites serve text, image, video, and rich media advertisements automatically, targeted to the site's content and audience. These advertisements are administered, sorted, and maintained by Google itself, and publishers earn money on either a per-click or per-impression basis.

Q31: Can you define Adword?


Ans: AdWords is Google's main advertising product, used to show your ads on Google Search and partner websites. Its primary module is PPC (Pay Per Click) advertising, with a CPC (Cost Per Click) sub-module in which you bid the rate you will be charged each time a user clicks your advertisement. Another sub-module is CPM (Cost Per Thousand Impressions) advertising, where the advertiser pays the publisher a flat rate per thousand impressions. AdWords also supports site-targeted advertising with banner, text, and rich media ads. Your ad is shown chiefly to people already looking for the type of product you offer, and you can choose particular sites and geographical areas in which your ads appear.


Q32: What is PPC?

Ans: PPC is the abbreviated form of Pay Per Click, an advertising model best known through Google's campaigns. It is a primary module with two sub-modules: CPC (cost per click), priced through bidding, and CPM (cost per thousand impressions), priced at a flat rate. In CPC the advertiser is only charged when a user clicks on their advert.


Q33: What are the aspects in SEO?


Ans: The main aspects of SEO fall into two classes: on-page SEO and off-page SEO.
On-page SEO includes meta tags, descriptions, keyword optimization, site structure and analysis, etc.
Off-page SEO covers keyword research, unique and quality content, and link building through blog comments, blog posting, article submission, press releases, classified posting, and forum posting.


Q34: What do you know about RSS?


Ans: RSS (Really Simple Syndication) is used to publish frequently updated work such as news headlines and blog entries. An RSS document, also known as a web feed, feed, or channel, contains summarized text plus metadata such as authorship and publishing dates.
RSS feeds give publishers flexibility by syndicating content automatically. Because a feed is a standardized XML file format, the information can be published once and read by many different programs. It also makes it easier for readers to get timely updates by subscribing to their favorite sites.

Thursday, September 27, 2012

SEO Tools


Google PageRank Checker Tool

Use our free PageRank lookup to query and check the current PageRank that Google reports for a URL. Our real-time reporting tool shows you the PageRank currently reported by Google.

Link Popularity Utility

Determine how your site's link popularity is doing with our free search engine optimization link lookup service. Our free tool queries the major search engines, such as Google, Yahoo, and MSN, and reports back on your link popularity.

Pages Indexed by Search Engines

A search engine optimization tool that queries how many of a website's pages are indexed. The Query Tool checks the major search engines, such as Google, Yahoo, and MSN, and reports back how many pages of the website each one has indexed.

Search Engine Results Pages SERP Lookup

Our SERPs utility checks the current search engine results pages. The SERPs Lookup Tool queries the major search engines for your keywords and checks whether a URL is found.

IP Address Tool

The IP Lookup Utility checks the ip address you are currently connecting to the server from. It displays information about your web browser, ip address, and information used by web servers.

Whois Lookup Utility

The Whois Information Lookup Utility checks the internet domain registration in central databases and returns back the who-is information for that domain.

Glossary of SEO


Glossary of SEO Terms:

Anchor Text: "Linked Text" Anchor text is the text that you click on to activate and follow a hyperlink to another web page or another web site.

Alt Tag: Alt tag refers to the text associated with an image. It originated when not all web browsers were graphical, so the alt text served as a description of the image. Nowadays it also appears in the mouseover text popup shown when you move your mouse over an image.

Cloaking: Method by which specific content is served to the search engine spider that is different from what a normal visitor sees.

CPC: Cost per click. This is a frequent term used in PPC terminology. It refers to the cost associated with each click.

CPA: Cost per Acquisition or also known as Cost Per Sale. This is typically an average dollar amount to the total cost in clicks it takes to convert to a sale.

Conversion Ratio: The conversion ratio is the average share of visitors to the site who make a purchase or perform some other action (registering as a customer, etc.). Conversion ratios vary by the type of traffic you are getting; more targeted traffic typically means higher conversion rates.

Doorway pages: Pages set up specifically for search engines. Once a visitor reaches the page, they are redirected to another website.

Hits: This generally means ALL requests to a web server, including requests by a web browser for HTML pages, JPEGs, GIFs, and other images. "Hits" is a phrase often thrown around, but it is generally not very meaningful in quantifying search engine traffic.

PPC: Pay Per Click.

PageRank: Formula developed by Google to determine a web page's inbound-link ranking. Often referred to as its "PR" value.

PageViews: Number of times your webpage was viewed. Includes duplicate views by the same visitor.

ROI: Return on Investment. A quantitative analysis of investment in advertising and marketing budgets and the resulting return on the investment.

SEO: Search Engine Optimization.

SEM: Search Engine Marketing.

SERPS: Search Engine Result Pages.

SPAM: Unscrupulous or unethical means of inflating results. Usually deteriorates the quality of listings and often results in penalties or being banned from a search engine.

Unique Visitors: Total number of unique visitors to your website or web page.
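The PageRank entry above refers to Google's formula. Brin and Page published a simplified version: PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where the Ti are pages linking to A, C(T) is the number of outbound links on T, and d is a damping factor (commonly 0.85). A toy implementation on a made-up three-page graph, purely for illustration:

```python
def pagerank(links, d=0.85, iterations=50):
    """Iterate the simplified PageRank formula on a tiny link graph.
    links maps each page to the list of pages it links out to."""
    pr = {page: 1.0 for page in links}
    for _ in range(iterations):
        pr = {
            page: (1 - d) + d * sum(
                pr[src] / len(targets)
                for src, targets in links.items()
                if page in targets
            )
            for page in links
        }
    return pr

# C receives links from both A and B, so it ends up with the highest score.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))
```

Google's production algorithm is of course far more elaborate; this only shows the "votes from inbound links" idea behind the PR value.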

PPC


Definition:

"PPC" refers to paying for clicks, aka Pay Per Click. Generally speaking, search engines are businesses trying to generate income; they have bills to pay just like everyone else. So how do they make money? They sell listings and advertisements in one shape or another. With PPC you pay a fee, ranging from a few cents to $5 and up per click, depending on how competitive the term is. Overture and Google AdWords are the two best-known PPC services. There are also smaller services, such as FindWhat, that provide PPC feeds.

"SEO" refers to Search Engine Optimization. This generally refers to "natural" or "Organic" results. They are the non-sponsored results that a search engine returns back to the searcher as "relevant"

PPC and SEO combined are often combined referred to as "Search Engine Marketing"

Is PPC right for me?
That is a difficult question to answer. Generally speaking, we recommend to our customers a balanced approach between natural SEO and PPC listings: never put all your eggs in one basket. Depending on the competition in your market and your search phrase, your cost per click can easily pass $5, $10, or $20 per click. That may be a difficult amount to handle for smaller websites with limited advertising budgets. Even if you do have plenty of money to spend on PPC listings, many surfers "zone out" of the paid listings and look only at the natural listings.

    PPC Positives

        Fast results. Generally speaking, PPC listings can be analyzed, activated, and turned on in a matter of days or weeks.
        You get what you pay for. The highest bidder generally achieves the greatest number of clicks.
        PPC results can be analyzed on a concrete basis. Say you purchase 1,000 clicks at $0.10 a click and receive 10 sales at $50 each: for $100 of paid clicks you achieved $500 in sales. Your ROI (return on investment) and cost per sale are easy to calculate.
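The arithmetic in that example is easy to script. A small sketch using the hypothetical figures from the text:

```python
def ppc_report(clicks, cost_per_click, sales, revenue_per_sale):
    """Summarize a PPC campaign: total cost, revenue, cost per sale, and ROI."""
    cost = clicks * cost_per_click
    revenue = sales * revenue_per_sale
    return {
        "cost": cost,
        "revenue": revenue,
        "cost_per_sale": cost / sales,
        "roi_percent": (revenue - cost) / cost * 100,
    }

# 1,000 clicks at $0.10 each, converting to 10 sales at $50 each.
report = ppc_report(1000, 0.10, 10, 50)
print(report)  # cost $100, revenue $500, $10 per sale, 400% ROI
```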

    Negatives

        PPC cost. Costs can add up for pay-per-click campaigns. You may be competing with very large Fortune 100 companies that have nearly endless funds available for advertising and marketing.
        PPC tracking and reporting. Keeping track of the competition and managing your pay-per-click campaigns can be a time-consuming, full-time task. Our staff of SEO consultants can provide pay-per-click management services to maintain and track campaigns for you and report back on how you are doing. Additionally, advantageseoservices has partner agreements with companies that can provide pay-per-click XML feeds to various search engines and maintain these feeds for you.

   



Natural SEO

On the other hand, natural SEO is a drawn-out process that takes two, three, four months and beyond to achieve top rankings on highly competitive terms. With a moderate PPC budget, which we can manage for you, you will achieve fast results and more traffic, as the turnaround time on PPC listings is generally less than 5 business days. Optimizing your site for organic "free listings" will often require restructuring the content of your website.

    SEO Positives

        Strategic search engine optimization results are typically long-lasting, although they do change from time to time. You pay an upfront fee, and there is no cost per click.
        Sponsored listings ignored. Many surfers "zone out" of sponsored (PPC) results and look to the regular listings for what they want. This is a natural phenomenon, just as some TV viewers respond to commercials and some do not; both are consumers, but they absorb information by different means.
        Long-term SEO strategy. PPC campaigns are quick fixes for traffic, but you can easily be outbid by companies with deeper pockets. Focusing on the SEO part of online marketing greatly increases your long-term profitability.

    SEO Negatives

        SEO is a long-term process. Changes are not immediate, and it takes weeks to several months to improve rankings.
        Page changes required. You must be willing to change the structure and content of your website and pages. This means directly editing the pages and content of your site. If you are unwilling to make any changes, SEO will likely not work for you.
        Certain web technology causes problems. Sites that are heavy on graphics or Flash, or that rely exclusively on JavaScript or other client-side scripting, will have a difficult time achieving good rankings.

Monday, June 25, 2012

How To Use HTML Meta Tags

Want top search engine rankings? Just add meta tags and your website will magically rise to the top, right? Wrong. Meta tags are one piece in a large algorithmic puzzle that major search engines look at when deciding which results are relevant to show users who have typed in a search query. While there is still some debate about which meta tags remain useful and important to search engines, meta tags definitely aren't a magic solution to gaining rankings in Google, Bing, Yahoo, or elsewhere – so let's kill that myth right at the outset. However, meta tags help tell search engines and users what your site is about, and when meta tags are implemented incorrectly, the negative impact can be substantial and heartbreaking. Let's look at what meta tags are, what meta tags matter, and how to avoid mistakes when implementing meta tags on your website.
What Are Meta Tags?
HTML meta tags are officially page data tags that sit between the opening and closing head tags in the HTML code of a document. The text in these tags is not displayed, but it is parsable and tells browsers (or other web services) specific information about the page. Simply put, it "explains" the page so a browser can understand it. Here's a code example of meta tags:

<head>
<title>Not a meta tag, but required anyway</title>
<meta name="description" content="Write your description here" />
<meta name="keywords" content="Write your keyword here" />
</head>

Search Engine Placement Tips

A query on a crawler-based search engine often turns up thousands or even millions of matching web pages. In many cases, only the ten most "relevant" matches are displayed on the first page. Naturally, anyone who runs a web site wants to be in the "top ten" results. This is because most users will find a result they like in the top ten. Being listed 11 or beyond means that many people may miss your web site. The tips below will help you come closer to this goal, both for the keywords you think are important, and for phrases you may not even be anticipating.

Pick Your Target Keywords
How do you think people will search for your web page? The words you imagine them typing into the search box are your target keywords. For example, say you have a page devoted to stamp collecting. Anytime someone types "stamp collecting," you want your page to be in the top ten results. Accordingly, these are your target keywords for that page. Each page in your web site will have different target keywords that reflect the page's content. For example, say you have another page about the history of stamps. Then "stamp history" might be your keywords for that page. Your target keywords should always be at least two or more words long. Usually, too many sites will be relevant for a single word, such as "stamps." This "competition" means your odds of success are lower. Don't waste your time fighting the odds. Pick phrases of two or more words, and you'll have a better shot at success. The Researching Keywords article provides additional information about selecting key terms.

Position Your Keywords
Make sure your target keywords appear in the crucial locations on your web pages. The page's HTML title tag is most important. Failure to put target keywords in the title tag is the main reason why perfectly relevant web pages may be poorly ranked. More about the title tag can be found on the How To Use HTML Meta Tags page. Build your titles around the top two or three phrases that you would like the page to be found for. The titles should be relatively short and attractive. Think of newspaper headlines: with a few words, they make you want to read a story. Similarly, your page titles are like headlines for your pages. They appear in search engine listings, and a short, attractive title may help encourage users to click through to your site.

Search engines also like pages where keywords appear "high" on the page, as described more fully on the Search Engine Ranking page. To accommodate them, use your target keywords for your page headline, if possible, and have them also appear in the first paragraphs of your web page. Keep in mind that tables can "push" your text further down the page, making keywords less relevant because they appear lower on the page. This is because tables break apart when search engines read them. For example, picture a typical two-column page, where the first column has navigational links, while the second column has the keyword-loaded text. Humans see such a page like this:

Home        Stamp Collecting
Page 1
Page 2      Stamp collection is worldwide experience.
Page 3      Thousands enjoy it everyday, and millions
Page 4      can be made from this hobby/business.
Search engines (and those with old browsers) see the page like this:

Home
Page 1
Page 2
Page 3
Page 4
Stamp Collecting
Stamp collection is worldwide experience. Thousands enjoy it everyday, and millions can be made from this hobby/business.

See how the keywords have moved down the page? There is no easy way around this, other than simplifying your table structure. Consider how tables might affect your page, but don't necessarily stop using them. I like tables, and I'll continue to use them. Large sections of JavaScript can also have the same effect as tables: the search engine reads this information first, which causes the normal HTML text to appear lower on the page. Place your script further down on the page, if possible. The Hiding JavaScript article provides additional information about JavaScript and search engines.

Create Relevant Content
Changing your page titles is not necessarily going to help your page do well for your target keywords if the page has nothing to do with the topic. Your keywords need to be reflected in the page content. In particular, that means you need HTML text on your page. Sometimes, sites present large sections of copy via graphics. It looks pretty, but search engines can't read those graphics. That means they miss out on text that might make your site more relevant. Some of the search engines will index ALT text and comment information. But to be safe, use HTML text whenever possible. Some of your human visitors will appreciate it, also. Be sure that your HTML text is "visible." Some designers try to spam search engines by repeating keywords in a tiny font or in the same color as the background color to make the text invisible to browsers. Search engines are well aware of these and other tricks. Expect that if the text is not visible in a browser, then a search engine may not index it. Finally, consider "expanding" your text references, where appropriate. For example, a stamp collecting page might have references to "collectors" and "collecting." Expanding these references to "stamp collectors" and "stamp collecting" reinforces your strategic keywords in a legitimate and natural manner. Your page really is about stamp collecting, but edits may have reduced its relevancy unintentionally.

Avoid Search Engine Stumbling Blocks
Some search engines see the web the way someone using a very old browser might. They may not read image maps. They may not read frames. You need to anticipate these problems, or a search engine may not index any or all of your web pages.

Create HTML links
Often, designers create only image map links from the home page to inside pages. A search engine that can't follow these links won't be able to get "inside" the site. Unfortunately, the most descriptive, relevant pages are often inside pages rather than the home page. Solve this problem by adding some HTML hyperlinks to the home page that lead to major inside pages or sections of your web site. This is something that will help some of your human visitors, also. Put these hyperlinks down at the bottom of the page. The search engine will find and follow them. Also consider creating a site map page with text links to every page within your site. You can submit this page, which will help the search engines locate pages within your web site. Finally, be sure you do a good job of linking internally between your pages. If you naturally point to different pages from within your site, you increase the odds that search engines will follow links and find more of your web site.

Frames Can Kill
Some of the major search engines cannot follow frame links. Make sure there is an alternative method for them to enter and index your site, either through meta tags or smart design. For more information, see the tips on using frames.

Dynamic Doorblocks
Are you generating pages via CGI or database-delivery? Expect that some of the search engines won't be able to index them. Consider creating static pages whenever possible, perhaps using the database to update the pages, not to generate them on the fly. Also, avoid symbols in your URLs, especially the ? symbol. Search engines tend to choke on it. The Search Engines And Dynamic Pages article provides additional information about testing for and solving dynamic delivery problems.

Build Inbound Links
Every major search engine uses link analysis as part of its ranking algorithm. This is done because it is very difficult for webmasters to "fake" good links, in the way they might try to spam search engines by manipulating the words on their web pages. As a result, link analysis gives search engines a useful means of determining which pages are good for particular topics. By building links, you can help improve how well your pages perform in link analysis systems. The key is understanding that link analysis is not about "popularity." In other words, it's not an issue of getting lots of links from anywhere. Instead, you want links from good web pages that are related to the topics you want to be found for. Here's one simple means to find those good links. Go to the major search engines. Search for your target keywords. Look at the pages that appear in the top results. Now visit those pages and ask the site owners if they will link to you. Not everyone will, especially sites that are extremely competitive with yours. However, there will be non-competitive sites that will link to you -- especially if you offer to link back. Why is this system good? By searching for your target keywords, you'll find the pages that the search engines deem authoritative, evidenced by the fact that they rank well. Hence, links from these pages are more important (and important for the terms you are interested in) than links from other pages. In addition, if these pages are top ranked, then they are likely to be receiving many visitors. Thus, if you can gain links from them, you might receive some of the visitors who initially go to those pages. There are also other ways to attract quality links. One that has recently gained traction is linkbaiting. Linkbaiting refers to a variety of techniques used on a web site to attract links from other web sites. This can include content, online tools, downloads, or anything else that other site owners might find compelling enough to link to. 
The originators of linkbaiting techniques were Aaron Wall and Andy Hagans. The post on SEO Book, "101 Ways to Build Link Popularity in 2006," can give you an idea of how to use linkbaiting to attract quality links. The More About Link Analysis page provides in-depth advice on building relevant links to your web site.

Just Say No to Search Engine Spamming
For one thing, spamming doesn't always work with search engines. It can also backfire. Search engines may detect your spamming attempt and penalize or ban your page from their listings. Search engine spamming attempts usually center around being top ranked for extremely popular keywords. You can try to fight that battle against other sites, but then be prepared to spend a lot of time each week, if not each day, defending your ranking. That effort usually would be better spent on networking and alternative forms of publicity, described below.

If the practical reasons aren't enough, how about some ethical ones? The content of most web pages ought to be enough for search engines to determine relevancy without webmasters having to resort to repeating keywords for no reason other than to try to "beat" other web pages. The stakes will simply keep rising, and users will also begin to hate sites that undertake these measures. Compare search engine spamming with spam email: no one likes spam email, and sites that use spam email services often face a backlash from those on the receiving end. Sites that spam search engines degrade the value of search engine listings, and as the problem has grown, these sites now face the same backlash that spam email generates.

Submit Your Key Pages
Most search engines will index the other pages from your web site by following links from a page you submit to them. But sometimes they miss pages, so it's good to submit the top two or three pages that best summarize your web site. Don't trust the submission process to automated programs and services. Some of them are excellent, but the major search engines are too important, and there aren't that many of them; submit manually, so that you can see if any problems are reported. Also, don't bother submitting more than the top two or three pages; it doesn't speed up the process to submit more. Submitting alternative pages is only insurance: in case the search engine has trouble reaching one of the pages, you've covered yourself by giving it another page from which to begin its crawl of your site. Be patient. It can take one to two months for your "non-submitted" pages to appear in a search engine, and some search engines may not list every page from your site.

Verify and Maintain Your Listing
Check on your pages and ensure they get listed, in the ways described on the Check URL page. Once your pages are listed in a search engine, monitor your listing every week or two. Strange things happen. Pages disappear from catalogs. Links go screwy. Watch for trouble, and resubmit if you spot problems. Resubmit your site any time you make significant changes. Search engines should revisit on a regular schedule. However, some search engines have grown smart enough to realize some sites only change content once or twice a year, so they may visit less often. Resubmitting after major changes will help ensure that your site's content is kept current.

Beyond Search Engines
It's worth taking the time to make your site more search engine friendly, because some simple changes may pay off with big results. Even if you don't come up in the top ten for your target keywords, you may find an improvement for target keywords you aren't anticipating. The addition of just one extra word can suddenly make a site appear more relevant, and it can be impossible to guess what that word will be.

Also, remember that while search engines are a primary way people look for web sites, they are not the only way. People also find sites through word-of-mouth, traditional advertising, traditional media, blog posts, web directories, and links from other sites. Since the advent of Web 2.0 applications, people are finding sites through feeds, blogs, podcasts, vlogs and many other means. Sometimes these alternative forms can be more effective draws than search engines. The most effective marketing strategy is to combine search marketing with other online and offline media.

Finally, know when it's time to call it quits. A few changes may be enough to achieve top rankings in one or two search engines. But that's not enough for some people, and they will invest days creating special pages and changing their sites to try to do better. This time could usually be put to better use pursuing non-search engine publicity methods. Don't obsess over your ranking. Even if you follow every tip and find no improvement, you still have gained something: you will know that search engines are not the way you'll be attracting traffic, and you can concentrate your efforts in more productive areas rather than wasting your valuable time.

How Search Engines Rank Web Pages

Search for anything using your favorite crawler-based search engine. Nearly instantly, the search engine will sort through the millions of pages it knows about and present you with ones that match your topic. The matches will even be ranked, so that the most relevant ones come first.
Of course, the search engines don't always get it right. Non-relevant pages make it through, and sometimes it may take a little more digging to find what you are looking for. But, by and large, search engines do an amazing job. As WebCrawler founder Brian Pinkerton puts it, "Imagine walking up to a librarian and saying, 'travel.' They’re going to look at you with a blank face." OK -- a librarian's not really going to stare at you with a vacant expression. Instead, they're going to ask you questions to better understand what you are looking for.
Unfortunately, search engines don't have the ability to ask a few questions to focus your search, as a librarian can. They also can't rely on judgment and past experience to rank web pages, in the way humans can. So how do crawler-based search engines go about determining relevancy when confronted with hundreds of millions of web pages to sort through? They follow a set of rules, known as an algorithm. Exactly how a particular search engine's algorithm works is a closely kept trade secret. However, all major search engines follow the general rules below.

Location, Location, Location...and Frequency
One of the main rules in a ranking algorithm involves the location and frequency of keywords on a web page. Call it the location/frequency method, for short. Remember the librarian mentioned above? They need to find books to match your request of "travel," so it makes sense that they first look at books with travel in the title. Search engines operate the same way. Pages with the search terms appearing in the HTML title tag are often assumed to be more relevant than others to the topic. Search engines will also check to see if the search keywords appear near the top of a web page, such as in the headline or in the first few paragraphs of text. They assume that any page relevant to the topic will mention those words right from the beginning. Frequency is the other major factor in how search engines determine relevancy. A search engine will analyze how often keywords appear in relation to other words in a web page. Those with a higher frequency are often deemed more relevant than other web pages.
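The location/frequency idea can be illustrated with a toy scoring function. The weights below (a title boost, an early-text boost, and a frequency term) are invented for this sketch; real engines keep their actual formulas secret and combine many more signals.

```python
import re

def score_page(title, body, query):
    """Toy location/frequency relevance score (illustration only)."""
    terms = query.lower().split()
    words = re.findall(r"[a-z0-9']+", body.lower())
    title_words = re.findall(r"[a-z0-9']+", title.lower())
    score = 0.0
    for term in terms:
        # Location: terms in the title count heavily.
        if term in title_words:
            score += 3.0
        # Location: terms near the top of the page count extra.
        if term in words[:50]:
            score += 1.0
        # Frequency: how often the term appears relative to page length.
        if words:
            score += 10.0 * words.count(term) / len(words)
    return score

page = ("Budget Travel Tips",
        "Travel on a budget with these travel tips for cheap flights.")
print(round(score_page(*page, "travel"), 2))   # prints 5.82
```

Changing any of the invented weights reorders results, which is exactly why the same query returns different rankings on different engines.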

Spice In The Recipe
Now it's time to qualify the location/frequency method described above. All the major search engines follow it to some degree, in the same way cooks may follow a standard chili recipe. But cooks like to add their own secret ingredients. In the same way, search engines add spice to the location/frequency method. Nobody does it exactly the same, which is one reason why the same search on different search engines produces different results. To begin with, some search engines index more web pages than others. Some search engines also index web pages more often than others. The result is that no search engine has the exact same collection of web pages to search through. That naturally produces differences, when comparing their results. Search engines may also penalize pages or exclude them from the index, if they detect search engine "spamming." An example is when a word is repeated hundreds of times on a page, to increase the frequency and propel the page higher in the listings. Search engines watch for common spamming methods in a variety of ways, including following up on complaints from their users.
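The stuffing example above (a word repeated excessively to inflate frequency) can be caricatured as a simple threshold check. The 25% cutoff and the helper name are invented for this sketch; real spam detection relies on many more signals than raw word share.

```python
from collections import Counter

def looks_stuffed(text, max_share=0.25):
    """Flag text where a single word makes up an outsized share of it.

    The 25% threshold is an illustrative guess, not a real engine's rule.
    """
    words = [w.lower() for w in text.split() if w.isalpha()]
    if not words:
        return False
    _, top_count = Counter(words).most_common(1)[0]
    return top_count / len(words) > max_share

spam = "cheap flights " * 50 + "book now"
print(looks_stuffed(spam))   # True: "cheap" alone is about half the text
```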

Off The Page Factors
Crawler-based search engines have plenty of experience now with webmasters who constantly rewrite their web pages in an attempt to gain better rankings. Some sophisticated webmasters may even go to great lengths to "reverse engineer" the location/frequency systems used by a particular search engine. Because of this, all major search engines now also make use of "off the page" ranking criteria. Off the page factors are those that webmasters cannot easily influence. Chief among these is link analysis. By analyzing how pages link to each other, a search engine can both determine what a page is about and whether that page is deemed to be "important" and thus deserving of a ranking boost. In addition, sophisticated techniques are used to screen out attempts by webmasters to build "artificial" links designed to boost their rankings. Another off the page factor is clickthrough measurement. In short, this means that a search engine may watch what results someone selects for a particular search, then eventually drop high-ranking pages that aren't attracting clicks, while promoting lower-ranking pages that do pull in visitors. As with link analysis, systems are used to compensate for artificial links generated by eager webmasters.
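The link-analysis intuition (a page is important if important pages link to it) is the idea behind PageRank-style algorithms. Below is a tiny self-contained sketch of that iteration; the damping factor, iteration count, and three-page miniature "web" are illustrative only, not any engine's actual system.

```python
def link_rank(links, iterations=50, damping=0.85):
    """PageRank-style link analysis sketch.

    `links` maps each page to the pages it links to. Each round, every
    page passes a damped share of its current rank to the pages it
    links to, so rank flows toward well-linked pages.
    """
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# "c" is linked to by both "a" and "b", so it accumulates the most weight.
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = link_rank(web)
best = max(ranks, key=ranks.get)
```

Note how "b" and "c" each receive a link from "a", yet "c" ends up ranked higher because it also collects "b"'s rank -- links from pages that are themselves well linked count for more.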

How Search Engines Work

The term "search engine" is often used generically to describe both crawler-based search engines and human-powered directories. These two types of search engines gather their listings in radically different ways.
Crawler-Based Search Engines
Crawler-based search engines, such as Google, create their listings automatically. They "crawl" or "spider" the web, then people search through what they have found. If you change your web pages, crawler-based search engines eventually find these changes, and that can affect how you are listed. Page titles, body copy and other elements all play a role.
Human-Powered Directories
A human-powered directory, such as the Open Directory, depends on humans for its listings. You submit a short description to the directory for your entire site, or editors write one for sites they review. A search looks for matches only in the descriptions submitted. Changing your web pages has no effect on your listing. Things that are useful for improving a listing with a search engine have nothing to do with improving a listing in a directory. The only exception is that a good site, with good content, might be more likely to get reviewed for free than a poor site.
"Hybrid Search Engines" Or Mixed Results
In the web's early days, it used to be that a search engine either presented crawler-based results or human-powered listings. Today, it is extremely common for both types of results to be presented. Usually, a hybrid search engine will favor one type of listings over another. For example, MSN Search is more likely to present human-powered listings from LookSmart. However, it does also present crawler-based results (as provided by Inktomi), especially for more obscure queries.

The Parts Of A Crawler-Based Search Engine
Crawler-based search engines have three major elements. First is the spider, also called the crawler. The spider visits a web page, reads it, and then follows links to other pages within the site. This is what it means when someone refers to a site being "spidered" or "crawled." The spider returns to the site on a regular basis, such as every month or two, to look for changes. Everything the spider finds goes into the second part of the search engine, the index. The index, sometimes called the catalog, is like a giant book containing a copy of every web page that the spider finds. If a web page changes, then this book is updated with new information. Sometimes it can take a while for new pages or changes that the spider finds to be added to the index. Thus, a web page may have been "spidered" but not yet "indexed." Until it is indexed -- added to the index -- it is not available to those searching with the search engine. Search engine software is the third part of a search engine. This is the program that sifts through the millions of pages recorded in the index to find matches to a search and rank them in order of what it believes is most relevant. You can learn more about how search engine software ranks web pages on the aptly-named How Search Engines Rank Web Pages Page.
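The three parts described above -- spider, index, and search software -- can be sketched in a few lines. The hard-coded `pages` dictionary stands in for real HTTP fetching, purely to keep the example self-contained; a real spider would download URLs and follow the links it finds in the HTML.

```python
# Each entry: URL -> (title, body text, outgoing links).
pages = {
    "/home": ("Acme Travel", "Cheap flights and travel deals.", ["/tips"]),
    "/tips": ("Travel Tips", "Pack light and book flights early.", []),
}

def spider(start):
    """Part 1: the spider crawls by following links from a start page."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        queue.extend(pages[url][2])   # follow outgoing links
    return seen

def build_index(urls):
    """Part 2: the index maps each word to the pages containing it."""
    index = {}
    for url in urls:
        title, body, _ = pages[url]
        for word in (title + " " + body).lower().split():
            index.setdefault(word.strip(".,"), set()).add(url)
    return index

def search(index, term):
    """Part 3: the search software looks the term up in the index."""
    return sorted(index.get(term.lower(), set()))

index = build_index(spider("/home"))
print(search(index, "flights"))   # prints ['/home', '/tips']
```

This also shows why a page can be "spidered" but not yet searchable: until `build_index` has processed it, `search` cannot find it.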
Major Search Engines: The Same, But Different
All crawler-based search engines have the basic parts described above, but there are differences in how these parts are tuned. That is why the same search on different search engines often produces different results. Some of the significant differences between the major crawler-based search engines are summarized on the Search Engine Features Page. Information on this page has been drawn from the help pages of each search engine, along with knowledge gained from articles, reviews, books, independent research, tips from others and additional information received directly from the various search engines.

Sunday, June 24, 2012

How Search Engines Work

Now let's look at how crawler-based search engines rank the listings they gather.

Introduction to SEO

Search engines are one of the primary ways that Internet users find Web sites. That's why a Web site with good search engine listings may see a dramatic increase in traffic. Everyone wants those good listings. Unfortunately, many Web sites appear poorly in search engine rankings or may not be listed at all because they fail to consider how search engines work. In particular, submitting to search engines (as covered in the Essentials section) is only part of the challenge of getting good search engine positioning. It's also important to prepare a Web site through "search engine optimization." Search engine optimization means ensuring that your Web pages are accessible to search engines and are focused in ways that help improve the chances they will be found.

Search Engine Optimization

Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in a search engine's "natural," or un-paid ("organic" or "algorithmic"), search results. In general, the earlier (or higher ranked on the search results page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search,[1] news search and industry-specific vertical search engines.

As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.

The acronym "SEOs" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site and site content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.