Friday, August 3, 2007

Verizon Introduces the Collaboration Calculator


Organizations will be able to measure how their members work together and learn how to improve collaboration using the new "Collaboration Calculator" introduced by Verizon Business.

Verizon Business announces its new web-based tool, developed by Frost & Sullivan, designed to quickly and easily measure the effectiveness of an organization's existing collaboration initiatives and determine the next steps for improving teamwork, with the goal of meeting common business objectives and increasing performance.

"In today's highly competitive global market, efficient collaboration can be a strategic competitive differentiator," said Nancy Gofus, Senior Vice President and Chief Marketing Officer of Verizon Business. "Last year, our 'Meetings Around the World' study demonstrated the direct correlation between collaboration and business performance. This year, we are building on that report to give our customers tools that help them determine the best possible way to use collaboration technologies to increase their own competitive advantage."

The Collaboration Calculator considers multiple aspects of business and government organizations, including their market and number of employees, the organization's collaboration culture and structure, and its use of collaboration technologies such as instant messaging, audio conferencing, video conferencing, web conferencing and meeting scheduling. Based on these factors, the Collaboration Calculator generates the organization's Collaboration Index score, which is then compared with the overall index score from the Meetings Around the World study and, where applicable, with companies of similar size in the same region and market sector.

Participants are assigned a ranking of above average, average or below average. The Calculator can also identify difficulties in the use of technology and suggest the best strategies for addressing them.
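As a rough illustration of the kind of scoring the announcement describes, the sketch below combines a few self-reported inputs into a single index and compares it against a benchmark. All field names, weights and the benchmark figure are hypothetical assumptions; Verizon does not disclose the calculator's actual scoring rules.

# A minimal sketch of an index-and-benchmark calculation of the kind described
# above. Field names, weights and the benchmark value are hypothetical.

ORG = {
    "collaboration_culture": 0.7,   # 0..1 self-assessment
    "structure_openness": 0.6,      # 0..1 self-assessment
    "tech_usage": {                 # share of employees using each tool, 0..1
        "instant_messaging": 0.8,
        "audio_conferencing": 0.6,
        "video_conferencing": 0.3,
        "web_conferencing": 0.5,
        "meeting_scheduling": 0.9,
    },
}

def collaboration_index(org: dict) -> float:
    """Combine culture, structure and tool adoption into a 0-100 score."""
    tech_score = sum(org["tech_usage"].values()) / len(org["tech_usage"])
    return 100 * (0.4 * org["collaboration_culture"]
                  + 0.3 * org["structure_openness"]
                  + 0.3 * tech_score)

PEER_BENCHMARK = 62.0  # hypothetical peer-group average from the study

score = collaboration_index(ORG)
if score > PEER_BENCHMARK + 5:
    band = "above average"
elif score < PEER_BENCHMARK - 5:
    band = "below average"
else:
    band = "average"
print(f"Collaboration Index: {score:.1f} ({band})")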

Spammers Exploit Movie Premieres


Events such as the premiere of The Simpsons Movie are becoming new attack vectors for spammers.

Once again, spammers are exploiting events such as the hit Simpsons movie to bombard email users with advertising messages, with the goal of harvesting their addresses in order to keep sending spam.

According to Martin Thorborg, co-owner and co-founder of SPAMfighter, "it is normal for spammers to take advantage of events like these, such as film premieres or stories with wide media coverage, to defraud email users. It is in our hands to keep users from falling victim to this type of scam by filtering these emails."

In this case, the spam message contains an image of Homer Simpson in his underwear asking "Are you planning to see the movie?". When users click the link, their email address is recorded, and they keep receiving spam.

To get users' attention and persuade them to open and read the message, the spammers offer a prize. As with other advertising offers, the user never receives any prize, because it does not exist. The spammers' goal is to collect email addresses in order to send even more spam.

This is not the first time spammers have used this tactic; the same thing happened with films such as "Harry Potter" and "Pirates of the Caribbean," among others.

BitDefender Launches New Security Forum


Hosted and accessible on the company's website, the forum has been created as an interactive communication channel for security topics.

BitDefender announces the launch of its new Spanish-language forum, which will serve as a meeting point for the BitDefender community to exchange opinions and information.

Through this interactive communication tool, users will be able to comment on the brand's product range, propose new features for future versions and share useful information with other users, among other possibilities.

The forum is divided into four rooms with different discussion topics: general, viruses and malware, spam and phishing, and products. It is backed at all times by BitDefender's technical support department, which answers questions and helps with any need raised.

Thursday, August 2, 2007

'Getting the Models Right': How to Value Hard-to-Price Assets


What are things worth?

The answer seems simple enough. Just look around the marketplace to see what similar items are selling for. But what if your house has a pool, while the one that sold next door doesn’t? Unless you are dealing with an item with exact duplicates that are bought and sold every day, like stock in a publicly traded company, it’s hard to know just what your item is worth.

It’s a devilish problem in the business world, where companies need to account for the fast-changing values of complex financial instruments — from insurance policies to employee stock options to exotic derivatives — for which there is no ready sales history. Yet accounting standards are tightening, requiring that businesses justify valuations rather than simply use their best guess or original purchase price, as they did in the past. So firms are turning to ever more complicated financial models that attempt to deduce values using an array of indicators.

“What you’re trying to figure out is: What if you had to sell [an asset] in the market? What would somebody be willing to pay?” said Wharton finance professor Richard J. Herring. “People are trading on the basis of these [models], but it is difficult, because they are extremely complex, and regulators are worried that they can be pretty easily manipulated.”

This dilemma was the topic of the Tenth Annual Wharton/Oliver Wyman Institute Risk Roundtable held May 31-June 1 and sponsored by The Wharton Financial Institutions Center and Oliver Wyman Institute. The Roundtable was hosted by Herring, the Center’s co-director.

International and U.S. accounting bodies are strengthening rules on how to place "fair value" on hard-to-price assets. Last September, for example, the Financial Accounting Standards Board (FASB) in the U.S. adopted Statement 157, which requires that, whenever possible, companies rely on market data rather than their own internal assumptions to value assets.

But some critics argue that computerized valuation models rely on assumptions so uncertain that the results should merely be noted in financial statements rather than included in tallies of assets and liabilities, as FASB requires. The new rules take effect with financial statements for fiscal years beginning after November 15, 2007. “Fair values are unverifiable.... Any model is an opinion embodying many judgments,” said critic Mark Carey, finance project manager for the Federal Reserve Board, during remarks at the conference.

While conceding that the Fed had “lost the battle” to minimize use of fair value accounting, he warned that allowing firms to set up their own valuation models, rather than relying on standardized ones, invites trouble. “The problem is fraud,” he noted. “The reason the Fed is concerned about this is because we are worried about the state of a world in which a firm wants to conceal its insolvency. That’s fairly easy to do in a fair value system.”

Insuring Against Catastrophe

Insurance is one field that is using more elaborate models to calculate risks, set policy prices and figure the current value of policies issued in the past, according to panelist Jay Fishman, chairman and CEO of The Travelers Companies. “Catastrophe modeling,” for example, forecasts the likelihood of earthquakes, terrorism and other events that result in claims.

In his presentation, “Insuring against Catastrophes: The Central Role of Models,” Fishman noted that insurers previously assessed catastrophe risks by analyzing past events. Typically, they figured average hurricane losses on a statewide basis, not accounting for greater damage in coastal areas and failing to properly estimate the greater damage an unusually large hurricane could cause. Before Hurricane Andrew struck the U.S. in 1992, the most damaging hurricane was Hugo in 1989. Hugo cost insurers $6.8 billion, while Andrew cost them $22 billion and left a dozen insurers insolvent.

New catastrophe models are far more complex, Fishman said, because they add data on likely storm paths predicted by scientists; the types of construction, ages and heights of buildings along those paths; the value of insurance issued; policy limits; deductibles, and other factors bearing on losses. In addition, insurers now consider changes in the frequency of big storms caused by factors like rising sea temperatures from global warming.
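For illustration only, an expected-loss calculation over a small portfolio for one modeled storm path might look like the sketch below. The property records, damage ratios and event probability are invented for the example; this is not Travelers' actual catastrophe model.

# An expected-loss sketch for a single modeled storm scenario. All figures
# are hypothetical and chosen only to show how policy limits, deductibles
# and building vulnerability feed into a modeled loss.
from dataclasses import dataclass

@dataclass
class InsuredProperty:
    insured_value: float   # coverage written on the building
    policy_limit: float    # maximum payout under the policy
    deductible: float      # amount retained by the policyholder
    damage_ratio: float    # expected share of value destroyed, given the
                           # construction type, age and height of the building

def expected_loss(portfolio: list[InsuredProperty], event_probability: float) -> float:
    """Expected insurer loss: probability of the event times the payout it would trigger."""
    payout = 0.0
    for p in portfolio:
        gross_damage = p.damage_ratio * p.insured_value
        payout += min(max(gross_damage - p.deductible, 0.0), p.policy_limit)
    return event_probability * payout

coastal = [InsuredProperty(400_000, 350_000, 5_000, 0.35),
           InsuredProperty(900_000, 750_000, 25_000, 0.50)]
print(f"Expected loss for this storm path: ${expected_loss(coastal, 0.02):,.0f}")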

With guidance from these more sophisticated models, Travelers has raised deductibles for wind damage, tightened its coverage for business interruption and changed premiums to reflect a better understanding of risk, according to Fishman, who adds, however, that models have limits. They are not good, for example, at accounting for long cycles in weather patterns, nor can they forecast claims when events are bigger than expected. Hurricane Katrina, for example, caused more damage inland than the models had forecast, he said.

Softening the Jolts

Similar shortcomings are found in models used in other industries, causing debate about how models should be constructed. Financial institutions have trouble, for example, tracking daily changes in values of credit default swaps, collateralized mortgage obligations, over-the-counter options, thinly traded bonds and other securities for which there is no liquid, transparent market.

It’s not uncommon, said Herring, for a large financial institution to have 2,000 valuation models for different instruments. And the penalties for getting the results wrong can be severe, as investors learned in the Enron and Long-Term Capital Management debacles, or with the recent financial restatements by Fannie Mae.

The problem has recently been highlighted by the fallout from the subprime mortgage lending binge of the past few years. These loans typically were bundled together and sold to investors as a form of bond. Now, rising interest rates increase the likelihood that some homeowners will fall behind on their payments, undermining the bonds’ values. But the models cannot account for these factors very well because subprime mortgages are so new that there is little historical data. Amidst this uncertainty, financial institutions are hustling to protect themselves, and consumers may find it harder to get loans as a result. Better modeling could soften these jolts.

Though valuation models must be customized for every instrument, they should share some underlying principles, said Thomas J. Linsmeier, a FASB member, noting that the goal of Statement 157 is to arrive at a price that would be received if the asset were sold in an “orderly transaction” — in other words, not in a crisis or “fire sale.”

Many financial assets are so highly customized that there are no comparable sales. Even when there are, many sales are private transactions that do not produce data for others to use as examples, he said. In these cases, the asset’s owner should try to determine what should be considered the “principal market” in which the asset would be bought and sold, so that data from smaller, less representative markets can be screened out to reduce confusion. “For many financial instruments there are many, many markets in which you might exchange those items...,” he noted. “If there is a principal market, let’s use that ... rather than using all possible markets.”

When there is no data on sales of comparable assets, firms should turn to market prices for similar assets, Linsmeier suggested. When that is not available either, firms must rely on their own internal estimates. But those should be based on the same assumptions an outside buyer would use, rather than on the firm’s own assumptions, which might be biased to make its accounts look better, he said, adding that, generally, any data obtained from the marketplace is preferred over internal company estimates.
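A minimal sketch of that preference order, assuming a simple data record per asset, is shown below. The record's field names and the example figure are hypothetical; real fair-value measurement under Statement 157 involves far more judgment than a three-way fallback.

# A sketch of the preference order described above: market data for the asset
# itself, then prices of similar assets, then internal estimates built on
# market-participant assumptions. Illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AssetInputs:
    quoted_price_identical: Optional[float] = None  # active market, identical asset
    observed_price_similar: Optional[float] = None  # observable price of a similar asset
    internal_estimate: Optional[float] = None       # firm's own model, using assumptions
                                                    # an outside buyer would use

def fair_value(inputs: AssetInputs) -> tuple[float, str]:
    """Prefer market observations over internal estimates, in that order."""
    if inputs.quoted_price_identical is not None:
        return inputs.quoted_price_identical, "quoted price in the principal market"
    if inputs.observed_price_similar is not None:
        return inputs.observed_price_similar, "market price of a similar asset"
    if inputs.internal_estimate is not None:
        return inputs.internal_estimate, "internal estimate (market-participant assumptions)"
    raise ValueError("no basis for a fair-value measurement")

# A thinly traded bond with no directly comparable sale of its own:
print(fair_value(AssetInputs(observed_price_similar=98.5)))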

Biases and Stock Options

The problem of internal firm biases influencing accounting is illustrated by the recent debate over whether companies should count stock options issued to executives and other employees as an expense.

While economists generally agreed that options are a cost of business that should be counted as an expense, many business groups opposed the move, noted Chester Spatt, chief economist at the Securities and Exchange Commission. Expensing opponents argued it was not possible to accurately value options years before they could be exercised, because their future value would depend on the company’s stock price at the time.

“It seems surprising that companies that apparently don’t understand the cost of a compensation tool would be inclined to use it to such an extent,” Spatt said, suggesting that companies do, in fact, know the value of their options grants but don’t want to reveal the cost to shareholders who might think executives are overpaid. Proper accounting would discourage companies from issuing too many options, he noted.

Markets have long used modeling to place present values on assets whose future values will fluctuate with market conditions, Spatt added. Traders, for example, use models to value collateralized mortgage obligations whose future value will depend on changing interest rates and homeowners’ default rates.

Though modeling has been around for many years and appears to be getting better, even those who design models concede they have flaws. “I think there is a lot more need for research and discussion of approaches for measuring model risk,” said panelist Darryll Hendricks, managing director and global head of quantitative risk control for UBS Investment Bank. Oftentimes, assumptions used in models turn out wrong, he pointed out. A common model input for valuing stock options, for example, is the expected price volatility of the stock. But future volatility may be very different from the past patterns used in the assumption.
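To see how much a single volatility assumption matters, a textbook Black-Scholes valuation (a standard formula, not necessarily the model any firm in the article uses) can be run with two different volatility inputs; the at-the-money spot, strike, rate and maturity below are hypothetical.

# A textbook Black-Scholes valuation of a European call, used only to show
# how sensitive the result is to the volatility assumption. All inputs are
# hypothetical.
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(spot: float, strike: float, rate: float,
                       vol: float, years: float) -> float:
    """Present value of a European call option under Black-Scholes assumptions."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * years) / (vol * sqrt(years))
    d2 = d1 - vol * sqrt(years)
    return spot * norm_cdf(d1) - strike * exp(-rate * years) * norm_cdf(d2)

# The same option under two volatility assumptions gives very different values:
for vol in (0.20, 0.40):
    value = black_scholes_call(spot=100, strike=100, rate=0.05, vol=vol, years=3.0)
    print(f"volatility {vol:.0%}: option value {value:.2f}")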

To make its models as good as possible, a firm should have a controlled, disciplined way of field testing them before introduction, and it should continually evaluate a model during the period it is used, Hendricks said. UBS discusses its models’ performance during monthly meetings among the traders who use them.

While modeling will continue to be controversial, Herring thinks it will keep getting better. He predicts firms will increasingly share data on their proprietary models, and he thinks model users will gradually adopt better standards for validating their models — making sure, for example, that evaluations are done by disinterested outsiders rather than the model designers themselves.

Advances in computing power and financial analysis have led to a mushrooming of new financial products in recent years, and should also help to improve the modeling used to measure those products’ values, Herring noted. “All of this has made it possible to produce these new products and models. But it also means a lot more is riding on getting the models right.”

This article is provided courtesy of Knowledge@Wharton.

WARM REGARDS

As we approach our first 1,000 visits, thank you, dear readers, for keeping up with the technology news and for giving the blog such a loyal welcome. Today marks almost two months since the blog began. What started one day as a hobby has become a challenge and a responsibility for me, because I know I cannot let you down. Every day there will be new things, new stories and new surprises. Thank you for always being online, and thank you for your support.

Sincerely,

Radhames Reyes V.

Wednesday, August 1, 2007

United States Denies Entry to Hacker


"Halvar Flake" will not be able to take part in the Black Hat conference being held in Las Vegas this week.

Upon his arrival in the United States, immigration officers took an interest in a bundle of documents that the German hacker Thomas Dullien was carrying in his luggage. Dullien was taken to an interrogation room, where it was decided to deny him entry to the United States.

The official reason given by the authorities is that Dullien had traveled to the country as a tourist. In the view of the U.S. authorities, his real purpose was to work, for which he needed a work visa.

Dullien had been invited to the Black Hat conference, which takes place this week in Las Vegas. The program included a talk on computer security by the German expert, which has now been canceled.

The hacker is now trying to obtain a U.S. visa, this time as an invited guest. However, the U.S. embassy in Berlin says the visa process could take up to six weeks, which all but eliminates the chances of Dullien participating in this year's conference.

The Black Hat conference runs in parallel with DefCon, an event that brings together hackers and computer security experts from around the world.

Nintendo Wii Is the Console of the Rich


I think this is old news by now, but it's worth knowing about, especially here in the Dominican Republic, where poverty accounts for only an average share of the population; it has been shown that there are not as many poor people here as is commonly believed.

The Nintendo Wii is overrepresented in high-income households, even though the console is cheaper than comparable products from the competition.

According to statistics analyzed by the U.S. research firm Nielsen, Wii owners have a higher annual income than PS3 or Xbox 360 owners.

The statistics show that Nintendo's console is the favorite in households with an average income above US$100,000. This is all the more surprising considering that the Wii is priced far lower than the PlayStation 3 (PS3) and Microsoft's Xbox 360.

The figures come from Nielsen's first analysis of the console market. The firm also reports that the PlayStation 2 tops the list with 42.3% of the time players spend in front of a console, while the Xbox accounts for 17%. The next places on the list go to the Xbox 360 (8%), GameCube (5.8%), Wii (4%) and PlayStation 3 (1.5%).

In its study, Nielsen adds that PS3 users spend an average of 83 minutes per gaming session, while GameCube users stay in front of the screen for only 55 minutes.

In the games market, World of Warcraft leads preferences by a wide margin; in total, users devote an average of 17 hours per week to the game.