
Wednesday, June 25, 2008

BILL GATES' Famous Quotes

Bill Gates' crystal ball
  • A compilation of the Microsoft founder's best quotes.
  • He leaves the company this month to devote himself to his foundation.

Bill Gates has been at the helm of Microsoft for the last three decades. Next Friday will be his last official day of work, and from July onward he will devote himself mainly to managing the projects of the Bill & Melinda Gates Foundation.

He leaves behind a company that has dominated the operating system market for years and is fighting to establish itself on the internet. He also leaves a large collection of predictions about the future of the technology sector, some more accurate than others, which you can browse on Wikiquote and PC World. Here is a compilation of the best of them.

"No hay en nuestro software un número significativo de errores que un número significativo de usuarios necesite ver resueltos (1995)".

"Quienes son criminales en vida real también lo serán en internet, donde la policía necesita ser un poco más sofisticada. El crimen online es solo parte de la maduración del medio (1996)".

"Los ordenadores serán mucho más baratos, tienen que bajar incluso por debajo de los 500 dólares, es algo que tiene que poder hacerse (1996)".

El futuro de la música, los ordenadores, internet o el correo basura figuran entre sus predicciones
"Dudo que el periódico sea muy diferente dentro de diez años, pero probablemente tenga más direcciones de páginas web (1997)".

"En algún momento dentro de 10 o 20 años podrás hablarle a tu ordenador y entenderá lo que le dices (1997)".

"Hay cosas que te pillan por sorpresa. Cuando apareció internet ocupó el 5º o 6º puesto en nuestra lista de prioridades. Pero llegó un momento en el que nos dimos cuenta de que era un fenómeno más profundo de lo que habíamos pensado (1998)".

"Tus clientes más insatisfechos son la mejor fuente de aprendizaje (1999)".

"Microsoft tuvo grandes competidores en el pasado, es bueno que existan museos para recordarlo (2001)".

"El spam será cosa del pasado en dos años (2004)".

"BluRay es el último soporte físico que veremos (2005)".

"Mi hija no sabe lo que es un disco. Sigo intentando encontrar uno para enseñárselo, pero es difícil en estos días. Pronto cosas como el listín telefónico o la enciclopedia impresa estarán igual de anticuadas (2008)".

GATES LEAVES MICROSOFT


Life without Gates begins at Microsoft
  • Microsoft's chairman gives up his responsibilities at the helm of the software company after more than 30 years of work.
  • The company faces new competitors on the internet and the renewal of its Windows operating system.
Microsoft's chairman, one of the most influential figures in the technology industry over the last three decades, will give up his responsibilities at the helm of the software multinational on June 30.

His fortune is valued at 43.79 billion euros
Since founding the company with Paul Allen in 1976, Bill Gates has amassed one of the largest fortunes ever known. He was recently displaced as the world's richest man by the Mexican businessman Carlos Slim, but his fortune remains the second largest on the planet: close to 43.79 billion euros.

Gates will give up his responsibilities at the helm of the company, but he will not leave the post of chairman and will remain a Microsoft employee. Officially, his last day of work will be Friday.

These funds will now be managed through the Bill & Melinda Gates Foundation, to which he will devote 80% of his time, leaving the remaining 20% for the company. His success, in many cases, has been due to the weakness of his competitors, as he acknowledged last week in an interview with the BBC.

The future of the company

After his departure, Bill Gates leaves several unresolved challenges for Microsoft, which is about to bid a definitive farewell to Windows XP without Vista having established itself as a valid alternative, above all in corporate environments.

The internet poses several challenges at once, in search, in advertising and in browsers
Problems also continue with the competition authorities, which, especially in Europe, have been very critical of the software company's monopolistic behavior. The failed merger with Yahoo also leaves Microsoft in a complicated position against other internet giants, such as Google.

Competition is also growing in one of its traditional areas of dominance, web browsers. Internet Explorer remains strong as the most widely used program, but Firefox, Opera and Safari keep growing little by little.

Friday, January 18, 2008

Campus Technology Summer Conference


Campus Technology Summer Conference
July 28–31, 2008
“Welcome to Next-Gen .EDU”

Call for Posters

Gain exposure for your work, share your project results with peers, and have in-depth discussions with attendees. Become a poster session presenter at Campus Technology 2008 in Boston, July 28–31, 2008.

Click HERE for more information and submission guidelines.

Full conference details will be available April 2008.

Thursday, October 4, 2007

Sony Ericsson W580i Walkman phone launched by AT&T


Sony Ericsson and AT&T have announced the launch of the W580i. The company first announced the Walkman Phone at CTIA 2007 back in March.

The 14mm thick Sony Ericsson W580i comes packed with everything a music phone needs: Walkman player 2.0 with MP3/AAC/AAC+/e-AAC+ support, 20 hours of music playback, FM Radio with RDS and speakerphone. Other features include a 2.0 megapixel camera, 12MB of internal memory, a 512MB memory card in the box and Bluetooth 2.0 with A2DP.

AT&T has started shipping the Sony Ericsson W580i for prices between $79 and $129 with a 2-year contract. This is a bit strange, but the price will depend on the where you live.

Multi-Core Technology coming to Smart-phones


Symbian and ARM Cooperate in Bringing Symmetric Multi Processing (SMP) to Future Phones Enabling High-End PC Capabilities to Consumers’ Pockets


Symbian Limited today announced Symbian OS support for the ARM Symmetric Multi-processor (SMP) architecture. SMP support in future versions of Symbian OS will use multiple CPU cores to provide ‘performance on demand’ – battery life will be improved by accessing cores only when running demanding high-end multimedia applications and powering them down when they are not in use. This announcement is a milestone in Symbian’s strategy for power efficiency for converged mobile devices, reinforcing Symbian’s position as technology leader.

Symbian and ARM are long standing partners and have successfully collaborated on technology development and product planning for over 10 years. The ARM® Cortex™-A9 MPCore™ multicore processor was announced earlier today at the ARM Developers’ Conference. Symbian and ARM are working together closely on supporting Cortex-A9 MPCore multicore processor-based CPUs in Symbian OS.

Multi-processing technology underlies next generation Cortex-A9 processor designs. In converged mobile devices, SMP CPUs consist of multiple cores which can be individually powered up and down by the operating system. This delivers high performance for high-end applications such as games, browser-based intelligent services, and media-rich applications such as video streaming or TV recording, while offering low power consumption when the device is idle or executing less performance-critical tasks. Symbian believes SMP support is a crucial step in continuing to deliver industry-leading battery life in a world where converged mobile devices offer increasingly performance-demanding features with constant battery capacity.
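
Symbian's own SMP interfaces are not shown in this announcement, so as a rough illustration of the "performance on demand" idea, the sketch below uses Linux's public CPU hotplug files (/sys/devices/system/cpu/cpuN/online), which let an operating system take individual cores offline and bring them back. It assumes a multi-core Linux machine and root privileges, and it is not Symbian code.

# Illustration only: Linux CPU hotplug, not Symbian's API. The idea is the same:
# keep secondary cores powered down until a demanding workload needs them.
from pathlib import Path

def set_core_online(core: int, online: bool) -> None:
    """Bring a secondary core online (True) or take it offline (False)."""
    Path(f"/sys/devices/system/cpu/cpu{core}/online").write_text("1" if online else "0")

set_core_online(1, False)   # park core 1 while the device is idle
# ... run light background tasks on core 0 ...
set_core_online(1, True)    # wake core 1 for a demanding multimedia task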

In order to take full advantage of SMP, Symbian is taking the following technology steps:
• multi-processor support in the Symbian OS kernel and device driver model
• targeted enhancements throughout Symbian OS
• extended Symbian OS developer tools to allow developers to access the benefits of SMP
• Symbian OS validation on Cortex-A9 based hardware and models

Symbian has already started to deliver SMP technologies to its customers and will roll out the above incremental developments in future versions of Symbian OS. Details of this will be announced in due course. The first Cortex-A9 MPCore processor-based Symbian smartphones are expected in 2010.

Thursday, September 20, 2007

Learn SEO Basics


This is our first SEO article, and we hope it will be a great resource for people like you. Let's go over a few basic points of SEO.

Let's take a few minutes to discuss how the search engines view your site. When the search engines come to crawl your site, what they see is text.

When the search engines come, they suck up all the text on your site and store it away for future use. They catalog the words on your site and use that information to decide whether your pages would meet the needs of a search someone does in the future.

Basically, they crawl your site and then index the information on it. The main things they are looking for are fresh content and content that is relevant to the user's search. So if I search for "place here what you offer", the engine would know that the content on your site might meet that search criteria and pull your site up as a relevant match.

If you found that people were searching for "Special lollypops from Mars," then you would want to have those keywords "special lollypops mars" throughout the content of your webpage.
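
As a rough, do-it-yourself check that those keywords actually made it into your copy, here is a minimal sketch (our own illustration, not one of the tools mentioned below): it strips the HTML tags from a page and counts how often each target phrase appears in the remaining text.

# Minimal sketch: count how often target phrases appear in a page's visible text.
import re

def keyword_counts(html, phrases):
    text = re.sub(r"<[^>]+>", " ", html).lower()   # crude tag stripping
    return {phrase: text.count(phrase.lower()) for phrase in phrases}

page = """<html><head><title>Special Lollypops from Mars</title></head>
<body><h1>Special lollypops, shipped from Mars</h1>
<p>Our special lollypops are made on Mars and shipped worldwide.</p></body></html>"""

print(keyword_counts(page, ["special lollypops", "mars"]))
# {'special lollypops': 3, 'mars': 3}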

We suggest you follow this 6-step procedure:

1. Keyword Research.
This means that instead of deciding what keyword you want to be found for, you research what keywords people are actually searching for that are relevant to your industry and expertise. This is why I think it is important for websites to have a niche market, because then they can be optimized for people searching for that niche.

2. Crafting Content for the Human Reader.
Some will try to play the game of keyword stuffing for the search engines, but our first focus is to craft content for your potential visitors. The visitors to the site are the ones you are trying to target and convert with your content, not the search engines.

3. Keyword Integration.
In this step we make sure that the proper keywords are placed in your content. This includes the keyword that was researched along with any other keywords related to it. There are online tools that you can use to do keyword research; they pull from the search engines' databases to find out what people are searching for. One thing to consider as you research keywords is what the needs and interests of your customers or potential visitors are.

http://wordtracker.com/ is one tool that I have used. It's a terrific resource.
After you get the keywords set into your page, we would ask you to...

4. Get the pages into the search engines. There is a useful tool called Google XML sitemap. With this tool, an XML sitemap is generated for your site and Google is notified each time your site is updated: https://www.google.com/webmasters/tools
Yahoo! also supports XML sitemaps. Along with getting your site out to these broad search engines, it is best to get your site into what are referred to as vertical search engines. These are search engines that are industry specific.

5. Monitor the progress of your pages through Analytics. This is an important step so that you will know who is visiting your pages, where they are coming from, how long they are staying on your site, and so on. Google Analytics is a free enhancement that you can add to your website.
Finally, we recommend that you...

6. Work in ranges of pages. This means that you work on 5-10 pages of your site at a time, with related keywords. It is not enough to optimize just one page of your site; it is best to work with a cluster of pages.

Hope this helps. Thanks for reading.

Friday, September 7, 2007

Why Aren't My Computer-Based Systems Smart Enough for My Business Already?

The computer-based systems most organizations rely on to support their businesses are not very smart. The answer is not to implement newer, “intelligent” systems. The fact is that much of today’s existing technology has the potential to be “smart enough” to make a big difference to an organization’s business.

The cost of information technology in today's organizations is substantial and is growing as a percentage of total operating cost—but today's systems still aren't smart enough. Why this is the case can only be understood by taking a historical perspective on business computing and how information technology (IT) developed as a discipline. In the past, IT departments were thought leaders in their organizations, setting the pace for the application of technology. Today, the needs and sophistication of the people who work in organizations are colored by an outside influence—the Internet—and IT organizations struggle to keep pace, burdened by accelerating demands and the drag of maintaining legacy systems. The tools, technology, and techniques for enterprise decision management (EDM) are ready, but impediments are rooted in the history of business computing and the resultant IT cultures. To understand why, after years of spending on information technology, so many of your systems aren't already smart enough, you need to take a walk back through the evolution of business computing in practice. With that in mind, this chapter is a brief look at the history of a typical IT infrastructure. It's not a general-purpose history but a brief outline of how we ended up in this mess.

How Did We Get Here?

A history of business computing would require volumes. However, a brief overview of the evolution of information technology (IT) in business decision making is useful in understanding why certain aspects of information management are still misaligned. Certain approaches that are taken for granted today are based on situations that disappeared long ago; certain disciplines that are considered unrelated are actually quite similar and synergistic. Some approaches aren't being used in a comprehensive enough way. Only through understanding the basis of these inefficiencies and, often, dysfunctions can you construct a rational remedy.

Computing in business is roughly 60 years old, although the deployment of computers for processing business information didn't start to catch on until the late 1950s. There were many computer manufacturers at that time, and each one developed its own proprietary operating systems and application software, although software was usually limited to a language compiler or two. Some manufacturers developed a reputation for general-purpose computing based on their programming languages (Assembler originally, but expanding to higher-level languages such as COBOL and FORTRAN); others came to specialize in certain applications, such as back-office work for banks (Burroughs, for example). At the time, general-purpose computing was what was needed for computing to take hold across industries.

Because general-purpose computing took hold, subsequent development of the technology proceeded rapidly, which created a constant lag between optimistic expectations and delayed delivery of useful output. Applying technology upgrades and innovations took considerably longer than producing them, a phenomenon that has only gotten worse over time. In the early days, the delivery of a new model meant reprogramming everything, because there was no upward compatibility. Today, that amount of effort seems almost ridiculous, yet according to a survey by Forrester Research, 75 percent of the IT budget is spent on maintenance of existing software and infrastructure. Organizations' ability to absorb new technology and put it to work is just as constrained today as it was at the dawn of computing. Unlike many other industries, the rate of change has neither slowed nor been adapted to.

People responsible for computers have been a unique breed in business. Their skills with abstract concepts or mysterious codes are in stark contrast to the formal, businesslike demeanor of accountants, salespeople, and executives. From the beginning, they were a separate group, in a career that had little or no trajectory into the "real" parts of business. This chasm between IT and the rest of the organization exists to this day and is still a major cause of dissonance between IT efforts and business requirements. In fact, in the past, IT managers had all they could do to manage IT software, hardware, and staff. They had no time to worry about the business itself, which was left to business professionals. Today, the stereotypes of the computer wizard in the basement with the pocket protector and the florid, cigar-smoking CEO are cartoonish and dated, but in reality, the gap is still severe.

Wednesday, August 29, 2007

The Future of the Workplace: No Office, Headquarters in Cyberspace

I found this article, and I think it is very interesting and well worth reading.

Extract From http://abcnews.go.com/WN/story?id=3521725&page=1

Some Companies Don't Care Where Workers Are as Long as They Get the Job Done

By BETSY STARK
ABC NEWS Business Correspondent

Aug. 27, 2007

Imagine a work world with no commute, no corporate headquarters and perhaps not even an office in the physical world at all.

For Bob Flavin, a computer scientist at IBM; Janet Hoffman, an executive at a management consulting firm; and Joseph Jaffe, a marketing entrepreneur, the future is already here.


Watch the full report from Betsy Stark tonight on "World News with Charles Gibson" at 6:30 p.m. EDT.

"These days we do so much by teleconference it really doesn't matter where you are," Flavin said.

Like 42 percent of IBM's 350,000 employees, Flavin rarely comes in to an IBM office.

"We don't care where and how you get your work done," said Dan Pelino, general manager of IBM's global health care and life sciences business. "We care that you get your work done."

IBM says it saves $100 million a year in real estate costs because it doesn't need the offices.

Head to Work, in Cyberspace

On the day we met Flavin, he was collaborating with computer scientists in British Columbia and Beijing from the on-call room of the local ambulance corps where he works as a volunteer.

The work force at the Accenture management consulting firm is so mobile not even the CEO has an office with his name on the door.

With no corporate headquarters, if you need a work space, you reserve it like a hotel room — checking in and out at a kiosk.

"Having a big desk as a sign of status with lots of family photos and you know, carpeting that's fluffy and nice, that is a vision of the past," said Hoffman, executive vice president of Accenture.

In the future, more companies with scattered work forces and clients may do what the marketing firm Crayon is doing: making its headquarters in cyberspace.

Crayon's workers rarely meet in the physical world — some are in Boston, others are in Nutley, N.J. — but their online alter egos in the virtual world gather once a week.

We never met Crayon's CEO in person but we spent a couple of hours together in cyberspace.

"Our belief is if we bring like minds together no matter where they are in the world we can actually create that connectedness as if we're actually at the same place at the same time," said Jaffe, Crayon's CEO.


Maintaining a community is essential at IBM, where Pelino said isolation is a "significant" issue. There's even a joke at the company that the name stands for "I'm by Myself."

"The casual meeting of colleagues in the cafeteria or at the water cooler is actually quite valuable and something you find you eventually miss when working at home," Flavin said. "We actually have to deliberately schedule some common lunches to make sure that we keep in contact." The company has also started to organize spirit clubs to foster a community. As Pelino put it, "You have to create these types of venues where you bring people together and magical things start to happen."

Thursday, August 2, 2007

'Getting the Models Right': How to Value Hard-to-Price Assets


What are things worth?

The answer seems simple enough. Just look around the marketplace to see what similar items are selling for. But what if your house has a pool, while the one that sold next door doesn’t? Unless you are dealing with an item with exact duplicates that are bought and sold every day, like stock in a publicly traded company, it’s hard to know just what your item is worth.

It’s a devilish problem in the business world, where companies need to account for the fast-changing values of complex financial instruments — from insurance policies to employee stock options to exotic derivatives — for which there is no ready sales history. Yet accounting standards are tightening, requiring that businesses justify valuations rather than simply use their best guess or original purchase price, as they did in the past. So firms are turning to ever more complicated financial models that attempt to deduce values using an array of indicators.

“What you’re trying to figure out is: What if you had to sell [an asset] in the market? What would somebody be willing to pay?” said Wharton finance professor Richard J. Herring. “People are trading on the basis of these [models], but it is difficult, because they are extremely complex, and regulators are worried that they can be pretty easily manipulated.”

This dilemma was the topic of the Tenth Annual Wharton/Oliver Wyman Institute Risk Roundtable held May 31-June 1 and sponsored by The Wharton Financial Institutions Center and Oliver Wyman Institute. The Roundtable was hosted by Herring, the Center’s co-director.

International and U.S. accounting bodies are strengthening rules on how to place “fair value” on hard-to-price assets. Last September, for example, the Financial Accounting Standards Board (FASB) in the U.S. adopted Statement 157 which requires that, whenever possible, companies rely on market data rather than their own internal assumptions to value assets.

But some critics argue that computerized valuation models rely on assumptions so uncertain that the results should merely be noted in financial statements rather than included in tallies of assets and liabilities, as FASB requires. The new rules take effect with financial statements for fiscal years beginning after November 15, 2007. “Fair values are unverifiable.... Any model is an opinion embodying many judgments,” said critic Mark Carey, finance project manager for the Federal Reserve Board, during remarks at the conference.

While conceding that the Fed had “lost the battle” to minimize use of fair value accounting, he warned that allowing firms to set up their own valuation models, rather than relying on standardized ones, invites trouble. “The problem is fraud,” he noted. “The reason the Fed is concerned about this is because we are worried about the state of a world in which a firm wants to conceal its insolvency. That’s fairly easy to do in a fair value system.”

Insuring Against Catastrophe

Insurance is one field that is using more elaborate models to calculate risks, set policy prices and figure the current value of policies issued in the past, according to panelist Jay Fishman, chairman and CEO of The Travelers Companies. “Catastrophe modeling,” for example, forecasts the likelihood of earthquakes, terrorism and other events that result in claims.

In his presentation, “Insuring against Catastrophes: The Central Role of Models,” Fishman noted that insurers previously assessed catastrophe risks by analyzing past events. Typically, they figured average hurricane losses on a statewide basis, not accounting for greater damage in coastal areas and failing to properly estimate the greater damage an unusually large hurricane could cause. Before Hurricane Andrew struck the U.S. in 1992, the most damaging hurricane was Hugo in 1989. Hugo cost insurers $6.8 billion, while Andrew cost them $22 billion and left a dozen insurers insolvent.

New catastrophe models are far more complex, Fishman said, because they add data on likely storm paths predicted by scientists; the types of construction, ages and heights of buildings along those paths; the value of insurance issued; policy limits; deductibles, and other factors bearing on losses. In addition, insurers now consider changes in the frequency of big storms caused by factors like rising sea temperatures from global warming.
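
As a hedged illustration of how such inputs combine (a toy of our own, not Travelers' model or any commercial catastrophe package, and with made-up numbers throughout), the sketch below simulates a year of storm losses and applies a per-event deductible and policy limit to estimate the insured loss.

# Toy catastrophe-model sketch: simulate event losses for one year, then apply a
# per-event deductible and policy limit. All numbers are illustrative placeholders.
import random

def simulated_annual_insured_loss(expected_events=0.6, mean_gross_loss=5e9,
                                  deductible=1e8, limit=2e9):
    # Crude event count: 10 Bernoulli trials approximating a Poisson frequency.
    n_events = sum(random.random() < expected_events / 10 for _ in range(10))
    insured = 0.0
    for _ in range(n_events):
        gross = random.expovariate(1 / mean_gross_loss)      # simple severity draw
        insured += min(max(gross - deductible, 0.0), limit)  # deductible, then limit
    return insured

years = [simulated_annual_insured_loss() for _ in range(100_000)]
print("average annual insured loss:", sum(years) / len(years))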

With guidance from these more sophisticated models, Travelers has raised deductibles for wind damage, tightened its coverage for business interruption and changed premiums to reflect a better understanding of risk, according to Fishman, who adds, however, that models have limits. They are not good, for example, at accounting for long cycles in weather patterns, nor can they forecast claims when events are bigger than expected. Hurricane Katrina, for example, caused more damage inland than the models had forecast, he said.

Softening the Jolts

Similar shortcomings are found in models used in other industries, causing debate about how models should be constructed. Financial institutions have trouble, for example, tracking daily changes in values of credit default swaps, collateralized mortgage obligations, over-the-counter options, thinly traded bonds and other securities for which there is no liquid, transparent market.

It’s not uncommon, said Herring, for a large financial institution to have 2000 valuation models for different instruments. And the penalties for getting the results wrong can be severe, as investors learned in the Enron and Long-Term Capital Management debacles, or with the recent financial restatements by Fannie Mae.

The problem has recently been highlighted by the fallout from the subprime mortgage lending binge of the past few years. These loans typically were bundled together and sold to investors as a form of bond. Now, rising interest rates increase the likelihood that some homeowners will fall behind on their payments, undermining the bonds’ values. But the models cannot account for these factors very well because subprime mortgages are so new that there is little historical data. Amidst this uncertainty, financial institutions are hustling to protect themselves, and consumers may find it harder to get loans as a result. Better modeling could soften these jolts.

Though valuation models must be customized for every instrument, they should share some underlying principles, said Thomas J. Linsmeier, a FASB member, noting that the goal of Statement 157 is to arrive at a price that would be received if the asset were sold in an “orderly transaction” — in other words, not in a crisis or “fire sale.”

Many financial assets are so highly customized that there are no comparable sales. Even when there are, many sales are private transactions that do not produce data for others to use as examples, he said. In these cases, the asset’s owner should try to determine what should be considered the “principal market” in which the asset would be bought and sold, so that data from smaller, less representative markets can be screened out to reduce confusion. “For many financial instruments there are many, many markets in which you might exchange those items...,” he noted. “If there is a principal market, let’s use that ... rather than using all possible markets.”

When there is no data on sales of comparable assets, firms should turn to market prices for similar assets, Linsmeier suggested. When that is not available either, firms must rely on their own internal estimates. But those should be based on the same assumptions an outside buyer would use, rather than on the firm’s own assumptions, which might be biased to make its accounts look better, he said, adding that, generally, any data obtained from the marketplace is preferred over internal company estimates.

Biases and Stock Options

The problem of internal firm biases influencing accounting is illustrated by the recent debate over whether companies should count stock options issued to executives and other employees as an expense.

While economists generally agreed that options are a cost of business that should be counted as an expense, many business groups opposed the move, noted Chester Spatt, chief economist at the Securities and Exchange Commission. Expensing opponents argued it was not possible to accurately value options years before they could be exercised, because their future value would depend on the company’s stock price at the time.

“It seems surprising that companies that apparently don’t understand the cost of a compensation tool would be inclined to use it to such an extent,” Spatt said, suggesting that companies do, in fact, know the value of their options grants but don’t want to reveal the cost to shareholders who might think executives are overpaid. Proper accounting would discourage companies from issuing too many options, he noted.

Markets have long used modeling to place present values on assets whose future values will fluctuate with market conditions, Spatt added. Traders, for example, use models to value collateralized mortgage obligations whose future value will depend on changing interest rates and homeowners’ default rates.

Though modeling has been around for many years and appears to be getting better, even those who design models concede they have flaws. “I think there is a lot more need for research and discussion of approaches for measuring model risk,” said panelist Darryll Hendricks, managing director and global head of quantitative risk control for UBS Investment Bank. Oftentimes, assumptions used in models turn out wrong, he pointed out. A common model input for valuing stock options, for example, is the expected price volatility of the stock. But future volatility may be very different from the past patterns used in the assumption.
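
To make that volatility point concrete, here is a small sketch using the textbook Black-Scholes formula (a standard modeling choice, not necessarily the model any firm in the article uses): the same option is worth very different amounts depending on the volatility you assume.

# Black-Scholes value of a European call, priced under three volatility assumptions.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """S: stock price, K: strike, T: years to expiry, r: risk-free rate, sigma: volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative numbers only: an at-the-money option on a $30 stock, 4 years out.
for sigma in (0.20, 0.35, 0.50):
    print(f"volatility {sigma:.0%}: option value {bs_call(30, 30, 4, 0.05, sigma):.2f}")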

To make its models as good as possible, a firm should have a controlled, disciplined way of field testing them before introduction, and it should continually evaluate a model during the period it is used, Hendricks said. UBS discusses its models’ performance during monthly meetings among the traders who use them.

While modeling will continue to be controversial, Herring thinks it will keep getting better. He predicts firms will increasingly share data on their proprietary models, and he thinks model users will gradually adopt better standards for validating their models — making sure, for example, that evaluations are done by disinterested outsiders rather than the model designers themselves.

Advances in computing power and financial analysis have led to a mushrooming of new financial products in recent years, and should also help to improve the modeling used to measure those products’ values, Herring noted. “All of this has made it possible to produce these new products and models. But it also means a lot more is riding on getting the models right.”

This article is provided courtesy of Knowledge@Wharton.

Wednesday, July 18, 2007

Ten Tips for Avoiding Google Hell

Have you ever wondered just what makes a website disappear from the search engines? Maybe one day your website is ranking well, and then a few weeks later it's like someone hit a switch and sent your site into the nether regions of the search engine rankings. Perhaps your site has never even managed to make it into the rankings at all? Welcome to the world of Search Engine Marketing — in particular, Search Engine Optimization (commonly referred to as SEO).

In a nutshell, SEO is the practice of optimizing a website's pages to ensure that they are search-engine friendly and that they target their intended key audiences. Each of the major search engines (Google, Yahoo!, MSN, and Ask) sends out "spiders" or "crawlers." These are programs that come to your website and "collect" all the information about each web page and take it back to the search engine's database. The search engine houses the information about the pages it collected and then presents it when a query from a searcher matches what your page is about. Each search engine applies its own algorithm to determine a page's relevancy to the searcher's query, and returns a set of results.

Here are some very basic tips that webmasters, or online marketers responsible for promotion of a website, should keep in mind. These fundamental strategies can help overcome many issues that webmasters and marketers encounter when dealing with search engines — issues that can inhibit a site from ranking, and prevent pages from being crawled or even push them into the supplemental index.



1. Use the robots.txt File

The robots.txt file is probably one of the easiest tactics webmasters can employ. Using it can help ensure that the right folders are being found and crawled by the search engines. Use of robots.txt can also assist webmasters in keeping the search engine spiders out of places they shouldn't be. This simple text file is placed in the root directory of your website and basically acts as a "roadmap" for the search engine spiders to follow when accessing your site. You can use specific commands to allow or block folders from being crawled by all search engine spiders or just a select few.

To create a robots.txt file, just use your notepad program, and follow the standard specifications for setting up instructions for the search engine spiders to follow. To get help understanding how to set up a robots.txt file, take a look at Robotstxt.org, a great resource for listing out specific robots and the syntax needed for excluding robots and folders. Referring to the specifications at Robotstxt.org will also help you avoid the pitfalls of possibly blocking folders or allowing the wrong folders to be indexed by the search engines.
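
As a simple illustration (all folder names below are hypothetical placeholders, so adapt them to your own site), a robots.txt file at the root of a domain might look like this:

# Hypothetical example: keep crawlers out of private and duplicate-content folders.
User-agent: *
Disallow: /admin/
Disallow: /print/
Disallow: /cgi-bin/

# Optional: point crawlers at the sitemap file discussed in tip #3.
Sitemap: http://www.example.com/sitemap.xml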



2. Validate Your Website

Two of the four major search engines, Google and Yahoo!, allow and encourage webmasters to validate their websites. This is a simple task that involves uploading files to the root of your website directory, or placing information within the website's meta tags.

When a website is validated, it helps to build a level of trust with the search engine. Validation also allows the search engine to know that this indeed is a current, truly valid and functioning website.

Yahoo! allows you to sign up for their SiteExplorer service (login required) and monitor your site's authentication. Blog feeds can also be submitted if the website has a blog. And Yahoo! will notify the email connected with the account when any actions are taken on the authenticated websites connected to the account.

Google's Webmaster Central is a great resource that allows webmasters to validate their websites, and offers a lot of other tools for monitoring a website's indexing with Google. Signing up is free. Once a website is validated through the Webmaster Central tool, a webmaster can monitor and control how often a Google search engine spider comes to "crawl" the website. Webmasters can also monitor links coming into the site, high-ranking keywords for the site, and most-clicked-on query results.

Both Yahoo! Site Explorer and Google's Webmaster Central allow webmasters to submit sitemaps — the subject of our next tip.

3. Use Sitemaps

Sitemaps work similarly to the robots.txt file, in that they act as a roadmap for search engines' spiders. A sitemap tells the search engines which pages of the website should be crawled. If the site is small, you can put all your page URLs into the sitemap file; however, you'll need a different approach if your site is a couple thousand pages or more.

For larger sites, you'll want to give the spiders a "guide" to follow, so make sure your higher-level category and subcategory pages are listed in the sitemap file. By including these higher-level pages in the sitemap, you let the spiders follow the links that will lead to deeper pages within your site.

A protocol for sitemaps was set up through a cooperative effort from Google, Yahoo!, and MSN. The specifications for setting up a sitemap file either through a .txt file or an .xml file can be found at the Sitemaps.org website. Following these specifications will help webmasters to ensure they are avoiding some common pitfalls. Currently, both Google and Yahoo! allow you to submit your sitemap file.
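
For reference, a minimal sitemap in the XML format defined at Sitemaps.org looks like the following (the URLs and values are placeholders; only the loc element is required for each entry):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/widgets/</loc>
  </url>
</urlset>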

4. Acquire Relevant Links

One of the biggest factors in having a website rank for keywords or phrases is more than simply the number of links acquired coming into the website — how those links are actually worded matters immensely.

Search engines consider a link to a website as a "vote of confidence." However, the search engine really takes into account just how the link is formed. For example, links to your site that are worded as just "click here" won't be as powerful as a link worded "Buy blue widgets at Blue Widgets' Website." The value of the link is greatly enhanced by using keywords within the link, rather than just "click here."
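
In HTML terms, the difference lies entirely in the text between the anchor tags (the URLs below are placeholders):

<!-- Weak: the engine learns nothing about the target page -->
<a href="http://www.example.com/">click here</a>

<!-- Stronger: the anchor text carries the keywords you want to rank for -->
<a href="http://www.example.com/blue-widgets/">Buy blue widgets at Blue Widgets' Website</a>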

If you have the ability to suggest or influence the formation of a link that points to your site, make sure to use the keywords you are hoping to rank for. Not only does it benefit your website; it can also help the site that is linking to you.

Webmasters and search marketers: Take heed of where you are acquiring links from. Links to your website from bad neighborhoods can negatively affect your website's relevancy to the search engine. ("Bad neighborhoods" are defined on the Webmaster Guidelines page in the foregoing link.) If the linking site seems shady or spammy, or if something just "doesn't seem right," it might be better to trust your instincts until you can thoroughly check out the site before acquiring the link from it.

5. Don't Sell Links

Selling links on a website is a fairly controversial subject. On one hand, webmasters should be allowed to do what they want with their websites. On the other hand, the search engines view this tactic as high-risk behavior if the links are not denoted as "paid" in some way. Then there is the whole issue of just what is considered a payment.

If you are selling links on your page, the search engines suggest that you tag the link with the rel=nofollow attribute. This basically tells the search engines that you are not "passing confidence" to the site you are linking to. But this action essentially defeats the whole purpose of selling links on a website, and therein lies the controversy.
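
In markup terms, the attribute goes on the anchor tag itself (the URL below is a placeholder):

<!-- A paid link flagged so that it passes no endorsement to the target site -->
<a href="http://www.example.com/sponsor/" rel="nofollow">Sponsor's site</a>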

Google's quality guidelines state, "Don't participate in link schemes designed to increase your site's ranking or PageRank." If your website sells links for this purpose, and it is discovered by the search engines, your website may receive some sort of penalty that will affect the site's ranking. And in Google, this could possibly affect the site's PageRank.

6. Eliminate Duplicate Content

Search engines want to display the most relevant web pages in response to a searcher's query. Therefore, when pages have identical or very similar content, some of those pages will likely be disregarded by the search engine. Some websites don't even realize they're creating duplicate pages because of the use of session IDs, or multiple variables in a URL taking site visitors to the same page. These are just a few ways duplicate content can be created and cause a substantial headache for marketers and webmasters alike.

Google states in its webmaster guidelines:

Be aware of the following other ways of accidentally creating duplicate content:

  • "Printer-friendly" pages
  • Manufacturers' product descriptions that are reused by other retailers
  • Mirrored websites
  • Syndicated articles

All of these situations can get a web page ranked lower in the search engine, or possibly get it pushed into Google's supplemental index — neither of which you want to happen for any website. There are some simple ways to avoid this, such as blocking folders from the search engine spiders with the Robots.txt file (see tip #1).

7. Use Correct Redirects

Redirects are generally used by webmasters to take visitors who land on old, out-of-date web pages (usually from prior bookmarks) to the pages that are most current within the website's new structure. There are two types of redirects: a Temporary Redirect, technically known as a 302, and a Permanent Redirect, technically known as a 301.

Using the wrong kind of redirect when a web page is being moved can cause the page to lose its rank within the search engine. If a web page is being moved permanently to a new URL, the Permanent Redirect/301 should be used.

A Permanent Redirect tells the search engine to pass all the old page's "juice" onto the new page. This means all the value acquired by links that still point to the old page will be passed onto the new (redirected) page. If a Temporary Redirect/302 is used, no value is passed onto the new page.
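
How the redirect is issued depends on your server. As one hedged example, on an Apache server the mod_alias Redirect directive can declare it in the site configuration or an .htaccess file (the paths and domain below are placeholders):

# Permanent (301) redirect from the old URL to the new one
Redirect 301 /old-page.html http://www.example.com/new-page.html

# By contrast, a temporary (302) redirect, which passes no value to the new page
Redirect 302 /holiday-sale.html http://www.example.com/current-promotions.html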

8. Simplify URL Parameters

Webmasters of sites that dynamically generate content based on variables in a URL should be cognizant that search engines might not be capable of crawling some web pages. The spiders/crawlers sent out by the search engines are very "childlike" in nature. Should they run into any "roadblocks," they will stop their crawl.

Having too many parameters in your URL is one of the roadblocks that can stop crawlers from going deeper. In general terms, most search engines have a difficult time crawling URLs with more than four parameters.

If having too many parameters is an issue with your website, it might be wise to discuss with your tech team some alternatives for passing the variables to the server to generate the dynamic pages. Tools such as ISAPI_Rewrite can be a powerful assist in efforts to resolve issues with variables.
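
The usual workaround is to expose short, parameter-free URLs to visitors and crawlers and rewrite them to the real dynamic URL on the server. The sketch below uses Apache mod_rewrite syntax with hypothetical paths (ISAPI_Rewrite, mentioned above, offers comparable rewriting on IIS):

# Crawler-friendly URL:  /products/12/345
# Actual dynamic page:   /product.php?category=12&item=345
RewriteEngine On
RewriteRule ^products/([0-9]+)/([0-9]+)$ /product.php?category=$1&item=$2 [L]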

9. Customize Title Tags and Meta Descriptions

One of the most important pieces in optimizing a web page is the page title tag: It tells search engines what the page is all about. A lot of relevancy is placed in a page's title tag, so careful crafting of that tag will not only reflect a company's position but also capitalize on the use of keywords and phrases.

Unless your company is a household name like Coca-Cola or Toyota or Ritz® crackers, it's unlikely that people will be searching on your company's name to find the services or products on your website. Keeping this in mind, it's clear that you should avoid title tags like "XYZ Company Services" or "About Us — XYZ Company."

Along with the title tag, the page's meta description is an important piece of the marketing puzzle. The meta description has little value in determining relevancy to a search engine, but it is displayed when a search engine displays its results page, right below the page's title tag. Customizing both the page's title tag and meta description to focus on the actual content of the page goes a long way toward getting searchers to click on your listing in the search results.
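
For instance, a hypothetical product page for the blue-widget example used earlier might carry something like this in its head section, with both elements describing the actual page rather than just the company name:

<head>
  <title>Blue Widgets - Custom Blue Widgets Shipped Nationwide | XYZ Company</title>
  <meta name="description"
        content="XYZ Company builds custom blue widgets in any size and ships nationwide. Browse the catalog or request a quote online.">
</head>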

10. Request Reinclusion (Google)

Let's say you've discovered that your website has been removed from Google's index, or that a penalty has been imposed on your website's pages. What's a webmaster or search marketer to do? Are you banned or penalized forever?

Thankfully, Google can be a forgiving search engine. If you address and fix the issues that Google has identified as the reason for the banning or penalty, you then have recourse: the Reinclusion Request. Google offers you the ability to submit this request to have your website reevaluated to verify whether the offending practices Google found have been "cleaned up."

The one thing the reinclusion request won't do is guarantee that Google will include your website in the top 10 when a search is done on keywords or phrases.

Get More Help

There are a few other issues that can give webmasters and search marketers some hassles when it comes to getting their websites to rank in the search engines. In general, though, we've covered some common problems that can be resolved rather easily. If your site is experiencing any search ranking issues that are particularly difficult to work out, don't be afraid to join the webmaster users groups on both Google and Yahoo! — both groups are friendly and helpful. You'll also find that employees from these search engines will lend a helping hand when they can.

Google and Yahoo! are currently the only two engines that have dedicated actual resources and created programs to communicate with webmasters and site owners. Both MSN and Ask are currently in the process of developing similar tools. MSN does have its own set of tools called AdLabs, but these tools are highly geared toward pay-per-click advertising.

So if you ever find yourself in "search engine hell," stop and take a look at the ten situations listed in this article and compare them to what's going on with your website. Hopefully there's a nugget or two of information here that will help you climb out of trouble and into higher rankings!

Thanks to the InformIT team for this excellent article.

I hope you enjoy this article and find it helpful.
