Wednesday, December 19, 2007

Visual WebGui - On-Server, Off-Client AJAX framework

Visual WebGui is an open source rapid application development framework for graphical user interfaces of IT web applications. Visual WebGui lets you cut development time and risk to a minimum throughout the life cycle of the application, without compromising performance or security and without adding complexity.

Visual WebGui is the only framework that provides seamless integration with Visual Studio and the .NET Framework, extending the ASP.NET paradigms in both design time and run time to support WinForms-style development for the web. The Visual WebGui offering is unique, not just another of the 150-odd AJAX frameworks! Visual WebGui replaces the ASP.NET methodologies, which were designed for developing sites, with WinForms methodologies, which were designed for developing applications.

To demonstrate one of the differences: Visual WebGui is the only framework that ships with a designer built for application interfaces (the WinForms designer) instead of one built for word-processing documents (the ASP.NET designer). This gives the developer an extremely efficient way to design interfaces using drag and drop instead of hand-coding HTML layouts.

This results in unprecedented simplicity and productivity in creating complex web IT applications.

For more information, click here.

Thursday, December 13, 2007

URL Rewriting Using ISAPI_Rewrite

ISAPI_Rewrite is a powerful URL manipulation engine based on regular expressions. It acts much like Apache's mod_rewrite, but is designed specifically for Microsoft's Internet Information Server (IIS). ISAPI_Rewrite is an ISAPI filter written in pure C/C++, so it is extremely fast. ISAPI_Rewrite gives you the freedom to go beyond standard URL schemes and develop your own.

What you can do with ISAPI_Rewrite:

* Optimize your dynamic content, such as forums or e-stores, to be indexed by popular search engines.
* Block hot-linking of your data files by other sites.
* Develop a custom authorization scheme and manage access to static files using custom scripts and a database.
* Proxy content of one site into a directory on another site.
* Make your intranet servers accessible from the Internet using only one Internet-facing server, with very flexible permission and security options.
* Create dynamic host-header-based sites using a single physical site.
* Create a virtual directory structure for the site, hiding physical files and extensions. This also helps when moving from one technology to another.
* Return browser-dependent content, even for static files.

Many other problems can be solved with the power of the regular-expression engine built into ISAPI_Rewrite.
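
For example, here is a minimal httpd.ini sketch (assuming ISAPI_Rewrite 2.x rule syntax; the paths and file names are purely illustrative) that serves search-engine-friendly URLs from an existing dynamic page:

[ISAPI_Rewrite]

# Map the friendly URL /products/42.htm onto the real dynamic page
# /products.asp?id=42, so search engines index clean URLs
RewriteRule /products/(\d+)\.htm /products.asp?id=$1 [I,L]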

For more detailed information, click here.

Points for Indexed Views in T-SQL

Sometimes the Query Optimizer does not choose the ideal index for best performance. Normally, if the following conditions are met, the Query Optimizer will automatically consider using an indexed view, if one exists, when running a query:

SQL Server 2000/2005 Enterprise Edition is used.
These options must be set to on:
ANSI_NULLS
ANSI_PADDING
ANSI_WARNINGS
ARITHABORT
CONCAT_NULL_YIELDS_NULL
QUOTED_IDENTIFIER


This option must be set to off:
NUMERIC_ROUNDABORT
But if you find that the index on the indexed view is not optimal, you can specify which index to use in an indexed view by following these two steps:

1. Use the NOEXPAND hint, which tells the Query Optimizer to treat the view like a table with a clustered index, and
2. Use the INDEX hint to specify which index to use. The syntax is:

INDEX ( index_val [ ,...n ] )

where index_val is the name or index ID of the index on the indexed view.
If you want to force a clustered index scan of the indexed view, use INDEX(0).

Try to Avoid Using Views in T-SQL

While views are often convenient to use, especially for restricting users from seeing data they should not see, they aren't always good for performance. So if database performance is your goal, avoid using views (SQL Server 2000/2005 Indexed Views are another story).

Views can slow down queries for several different reasons. For example, let's look at these two SELECT statements:

SELECT * FROM table_name

SELECT * FROM view_name

Which is faster? If you test it, you will find that the first SELECT statement is faster, although the execution plan for both will be the same. How can that be? It is because SQL Server must do extra work (such as looking up data in the system tables) before it can execute the view. This extra work is not part of the execution plan, so the two SELECT statements appear as though they should run at the same speed. They don't, because some of the work SQL Server is doing is hidden.

Another way views can hurt performance is when JOINs or UNIONs are used, and you don't intend to use all of the columns. This results in SQL Server performing unnecessary work (such as an unnecessary JOIN or UNION), slowing down the performance.

Like stored procedures, views are optimized the first time they are run, and their execution plan is cached in case it can be reused. But this alone is not reason enough to use a view.

Besides hurting performance, views are not all that flexible to work with. For example, they can't be changed on the fly, they can't be used to sort data, and using them for INSERTs, UPDATEs, and DELETEs is problematic. In addition, while views can be nested, nesting just compounds their problems, so avoid it.

Instead of views, use stored procedures. They are much more flexible and they offer better performance.

Wednesday, December 12, 2007

SQL Server Deadlocks

Deadlocking occurs when two user processes have locks on separate objects and each process is trying to acquire a lock on the object the other process holds. When this happens, SQL Server identifies the problem and ends the deadlock by automatically choosing one process as the victim and aborting it, allowing the other process to continue. The aborted transaction is rolled back and an error message is sent to the user of the aborted process. Generally, the transaction that requires the least amount of overhead to roll back is the one that is aborted.

As you might imagine, deadlocks can use up SQL Server's resources, especially CPU power, wasting it unnecessarily.

Most well-designed applications, after receiving a deadlock message, will resubmit the aborted transaction, which most likely can now run successfully. This process, if it happens often on your server, can drag down performance. If the application has not been written to trap deadlock errors and to automatically resubmit the aborted transaction, users may very well become confused as to what is happening when they receive deadlock error messages on their computer.

Here are some tips on how to avoid deadlocking on your SQL Server:

* Ensure the database design is properly normalized.
* Have the application access server objects in the same order each time.
* Don't allow any user input during transactions. Collect it before the transaction begins.
* Avoid cursors.
* Keep transactions as short as possible. One way to accomplish this is to reduce the number of round trips between your application and SQL Server by using stored procedures, or by keeping transactions within a single batch. Another way to reduce the time a transaction takes is to make sure you are not performing the same reads over and over. If your application does need to read the same data more than once, cache it by storing it in a variable or an array, and re-read it from there, not from SQL Server.
* Reduce lock time. Develop your application so that it grabs locks at the latest possible time and releases them at the earliest possible time.
* If appropriate, reduce lock escalation by using the ROWLOCK or PAGLOCK hints.
* Consider using the NOLOCK hint to prevent locking if the data being read is not modified often.
* If appropriate, use as low an isolation level as possible for the user connection running the transaction.
* Consider using bound connections.

*****

When a deadlock occurs, SQL Server by default chooses a deadlock "victim" by identifying which of the two processes will use the least amount of resources to roll back, and then returns error message 1205.

But what if you don't like the default behavior? Can you change it? Yes, you can, by using the following command:

SET DEADLOCK_PRIORITY { LOW | NORMAL | @deadlock_var }

Where:

LOW tells SQL Server that the current session should be the preferred deadlock victim, not the session that incurs the least amount of rollback resources. The standard deadlock error message 1205 is returned.

NORMAL tells SQL Server to use the default deadlock-handling method.

@deadlock_var is a character variable specifying which deadlock method you want to use. Specify "3" for LOW or "6" for NORMAL.

This command is set at run time for a specific user connection.
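
For example, a low-priority background batch might start like this (the table and update are hypothetical):

-- Make this session the preferred deadlock victim
SET DEADLOCK_PRIORITY LOW

BEGIN TRAN
UPDATE dbo.ReportTotals SET RunCount = RunCount + 1 WHERE ReportID = 7
COMMIT TRAN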

*****

To help identify deadlock problems, use the SQL Server Profiler's Create Trace Wizard to run the "Identify The Cause of a Deadlock" trace. This will provide you with the raw data you need to help isolate the causes of deadlocks in your databases.

*****

To help identify which tables or stored procedures are causing deadlock problems, turn on trace flag 1204 (outputs basic trace data) or trace flag 1205 (outputs more detailed trace data).

-- 3605 sends the trace output to the SQL Server error log;
-- -1 applies the flags to all connections
DBCC TRACEON (3605, 1204, -1)

Be sure to turn these trace flags off (DBCC TRACEOFF) when you are done, as tracing can eat up SQL Server's resources unnecessarily, hurting performance.

*****

Ideally, deadlocks should be eliminated from your applications. But if you are unable to eliminate all deadlocks in your application, be sure to include program logic in your application to deal with killed deadlock transactions in a user-friendly way.

For example, let's say that two transactions are deadlocked and that SQL Server kills one of the transactions. In this case, SQL Server will raise an error message that your application needs to respond to. In most cases, you will want your application to wait a random amount of time after the deadlock in order to resubmit the killed transaction to SQL Server.

It is important that there is a random waiting period because it is possible that another contending transaction could also be waiting, and you don't want both contending transactions to wait the same amount of time and then both try to execute at the same time, causing another deadlock.
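
Below is a minimal C# (ADO.NET) sketch of this retry pattern; the connection string, the UPDATE statement, and the retry limit are all illustrative:

using System;
using System.Data.SqlClient;
using System.Threading;

class DeadlockRetryDemo
{
    static void Main()
    {
        const int maxRetries = 3;
        Random random = new Random();

        for (int attempt = 1; attempt <= maxRetries; attempt++)
        {
            try
            {
                using (SqlConnection conn = new SqlConnection(
                    "Server=.;Database=Northwind;Integrated Security=SSPI"))
                using (SqlCommand cmd = new SqlCommand(
                    "UPDATE dbo.Orders SET ShipCity = 'Seattle' WHERE OrderID = 10248", conn))
                {
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
                break;  // success, stop retrying
            }
            catch (SqlException ex)
            {
                // 1205 is the deadlock-victim error; rethrow anything else,
                // and give up once the retry limit is reached
                if (ex.Number != 1205 || attempt == maxRetries)
                    throw;

                // Wait a random interval so two contending callers do not
                // resubmit at the same moment and deadlock again
                Thread.Sleep(random.Next(100, 1000));
            }
        }
    }
}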

Monday, December 10, 2007

10 Ways to Speed Up Windows XP

These points will definitely increase performance. I mean it.

1. Defrag Disk to Speed Up Access to Data
One of the factors that slows computer performance is disk fragmentation. When files are fragmented, the computer must search the hard disk when a file is opened to piece it back together. To speed up response time, you should run Disk Defragmenter monthly; it is a Windows utility that defrags and consolidates fragmented files for quicker computer response.

* Follow Start > All Programs > Accessories > System Tools > Disk Defragmenter
* Click the drives you want to defrag and click Analyze
* Click Defragment

2. Detect and Repair Disk Errors
Over time, your hard disk develops bad sectors. Bad sectors slow down hard disk performance and sometimes make data writing difficult or even impossible. To detect and repair disk errors, Windows has a built-in tool called the Error Checking utility. It’ll search the hard disk for bad sectors and system errors and repair them for faster performance.

* Follow Start > My Computer
* In My Computer right-click the hard disk you want to scan and click Properties
* Click the Tools tab
* Click Check Now
* Select the Scan for and attempt recovery of bad sectors check box
* Click Start

3. Disable Indexing Services
Indexing Services is a small application that uses a lot of CPU. By indexing and updating lists of all the files on the computer, it helps you search for files faster by scanning the index list. But if you know where your files are, you can disable this system service. It won't do any harm to your machine, whether you search often or not.

* Go to Start, click Settings
* Click Control Panel
* Double-click Add/Remove Programs
* Click Add/Remove Windows Components
* Uncheck Indexing Service
* Click Next

4. Optimize Display Settings
Windows XP is a looker. But it costs you system resources that are used to display all the visual items and effects. Windows looks fine if you disable most of the settings and leave the following:
* Show shadows under menus
* Show shadows under mouse pointer
* Show translucent selection rectangle
* Use drop shadows for icon labels on the desktop
* Use visual styles on windows and buttons

5. Speed Up Folder Browsing
You may have noticed that every time you open My Computer to browse folders there is a slight delay. This is because Windows XP automatically searches for network files and printers every time you open Windows Explorer. To fix this and increase browsing speed, disable the "Automatically search for network folders and printers" option (under Tools > Folder Options > View in Windows Explorer).


6. Disable Performance Counters
Windows XP has a performance monitor utility that monitors several areas of your PC's performance. These utilities take up system resources, so disabling them is a good idea.

* Download and install the Extensible Performance Counter List (http://www.microsoft.com/windows2000/techinfo/reskit/tools/existing/exctrlst-o.asp)
* Select each counter in turn in the 'Extensible performance counters' window and clear the 'performance counters enabled' checkbox at the bottom

7. Optimize Your Pagefile
You can optimize your pagefile. Setting a fixed size for your pagefile saves the operating system from the need to resize it.
* Right-click My Computer and select Properties
* Select the Advanced tab
* Under Performance, choose the Settings button
* Select the Advanced tab again and, under Virtual Memory, select Change
* Highlight the drive containing your page file and make the Initial Size of the file the same as the Maximum Size of the file

Windows XP sizes the page file to about 1.5x the amount of physical memory by default. While this is good for systems with smaller amounts of memory (under 512 MB), it is unlikely that a typical XP desktop system will ever need 1.5 x 512 MB or more of virtual memory.
If you have less than 512 MB of memory, leave the page file at its default size. If you have 512 MB or more, change the ratio to 1:1 (page file size to physical memory size).

8. Remove Fonts for Speed
Fonts, especially TrueType fonts, use quite a bit of system resources. For optimal performance, trim your fonts down to just those that you need to use on a daily basis and fonts that applications may require.
* Open Control Panel
* Open the Fonts folder
* Move fonts you don't need to a temporary directory (e.g. C:\FONTBKUP) just in case you need or want to bring a few of them back. The more fonts you uninstall, the more system resources you gain.

9. Use Flash Memory to Boost Performance
The usual way to improve performance is to install additional RAM, which lets you boot your OS much quicker, run more applications, and access data faster. But there is no easier or more technically elegant way to get a similar boost than to use eBoostr (http://www.eboostr.com).

eBoostr is a little program that lets you improve the performance of any computer running Windows XP, in much the same way as Vista's ReadyBoost. With eBoostr, if you have a flash drive, such as a USB thumb drive or an SD card, you can use it to make your computer run better. Simply plug the flash drive into a USB socket and Windows XP will use eBoostr to utilize the flash memory to improve performance.


The product shows the best results for frequently used applications and data, which makes it a great feature for people using office programs, graphics applications, or developer tools. It will be especially attractive to laptop owners, since laptop upgrades are usually more complicated and laptop hard drives are by definition slower than desktop drives.

10. Perform a Boot Defragment
There's a simple way to speed up XP startup: make your system do a boot defragment, which will put all the boot files next to one another on your hard disk. When boot files are in close proximity to one another, your system will start faster.
On most systems, boot defragment should be enabled by default, but it might not be on yours, or it might have been changed inadvertently. To make sure that boot defragment is enabled:

* Run the Registry Editor
* Go to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Dfrg\BootOptimizeFunction
* Set the Enable string value to Y if it is not already set to Y.
* Exit the Registry
* Reboot
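
Equivalently, the same change can be captured in a .reg file (a sketch; back up your registry before merging it):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Dfrg\BootOptimizeFunction]
"Enable"="Y"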

.NET language conversion links

Following are the language conversion links:

Convert C# to VB.NET
http://www.kamalpatel.net/ConvertCSharp2VB.aspx

http://www.eggheadcafe.com/articles/cstovbweb/converter.aspx

Convert VB.NET to C#
http://www.developerfusion.co.uk/utilities/convertvbtocsharp.aspx

Monday, December 3, 2007

The Difference between GET and POST

When the user enters information in a form and clicks Submit, there are two ways the information can be sent from the browser to the server: in the URL, or within the body of the HTTP request.

The GET method, which was used in the example earlier, appends name/value pairs to the URL. Unfortunately, the length of a URL is limited, so this method only works if there are a few parameters. The URL could be truncated if the form uses a large number of parameters, or if the parameters contain large amounts of data. Also, parameters passed in the URL are visible in the address field of the browser, which is not the best place for a password to be displayed.

The alternative to the GET method is the POST method. This method packages the name/value pairs inside the body of the HTTP request, which makes for a cleaner URL and imposes no size limitations on the form's output. It is also more secure.

ASP makes it simple to retrieve name/value pairs. If the form's output is passed after the question mark (?) in the URL, as occurs with the GET request method, the parameters can be retrieved using the Request.QueryString collection. Likewise, if the form is sent using the POST method, the form's output is parsed into the Request.Form collection. These collections let you address the form and URL parameters by name. For example, the value of the form variable User can be passed into a VBScript variable with one line of script:

<% UserName = Request.Form("User") %>

You don't need to specify the collection (Form or QueryString) in which you expect to find the User parameter. The following is an equally valid way of searching for the User parameter:

<% UserName = Request("User") %>

In the absence of a specific collection, the Request object will search all of its collections for a matching parameter. This is meant to be a programming convenience. However, the ASP Request object also contains collections for ServerVariables and ClientCertificates, which contain sensitive server and user authentication information. To avoid the possibility of spoofed values (values entered by the user in the URL), it is highly recommended that you explicitly use the collection name when searching for parameters from these collections.

The following script combines a form and an action (the script that processes the form) into a single page. By posting the form data back to the same ASP page that displays the form, server-side script can process the output of the form. This is perfectly valid, and for simple scripts it is often more convenient than posting to a second ASP page.

<%@ LANGUAGE="VBScript" %>
<!-- FILE: logon.asp -->
<HTML>
<HEAD>
<TITLE>Authentication Form</TITLE>
</HEAD>
<BODY BGCOLOR=#FFFFFF>
<% If Request.Form("User") = "" Then %>
<P>Please enter your name:
<FORM ACTION="./logon.asp" METHOD="POST">
Your name: <INPUT TYPE="TEXT" NAME="User">
Your password: <INPUT TYPE="PASSWORD" NAME="Pwd">
<INPUT TYPE="SUBMIT" VALUE="Log On">
</FORM>
<% Else 'User verification and logon code goes here %>
Welcome <%= Request.Form("User") %>!
<% End If %>
</BODY>
</HTML>

Note: If you use a separate ASP file to handle the processing of a form, the Request.Form collection will be emptied when you redirect to the new page. To retain the form values, you must copy them into the Session object, from which they can be accessed on subsequent pages.

Although the sample authentication form shown here works, there's a good reason why you would not want to use it in practice. Logon information is sensitive and should be rigorously protected from prying eyes. Although you can use the POST method to contain the user's password within the body of the HTTP request, it is still possible to intercept and read it.

For mission-critical applications, IIS 5.0 provides both secure authentication with integrated Windows authentication and Client Certificates, as well as data encryption with Secure Sockets Layer (SSL).

What’s New in SQL Server 2008


SQL Server 2008 delivers on the following four key areas of the Microsoft data platform vision:

Mission-critical platform – SQL Server 2008 enables organizations to run their most complex applications on a secure, reliable, and scalable platform while enabling IT to reduce the complexity of managing their data management infrastructure. SQL Server 2008 provides a secure and trusted platform by securing valuable information in existing applications and enhancing the availability of the data. SQL Server 2008 introduces an innovative policy-based management framework that enables policies to be defined for explicit and automated administration of server entities across one or multiple servers. In addition, SQL Server 2008 delivers predictable query performance with an optimized platform.

Dynamic development – SQL Server 2008, along with the .NET Framework, reduces the complexity of developing new applications. The ADO.NET Entity Framework enables developers to be more productive by working with logical data entities that align with business requirements instead of programming directly against tables and columns. The new Language Integrated Query (LINQ) extensions in the .NET Framework revolutionize how developers query data by extending Visual C#® and Visual Basic® .NET to support an SQL-like query syntax natively. And support for occasionally connected systems lets developers build applications that enable users to take data with them on devices and later synchronize it with central servers.
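As a small illustration of that query syntax, here is a LINQ to Objects sketch over an in-memory array (nothing here is specific to SQL Server 2008):

using System;
using System.Linq;

class LinqDemo
{
    static void Main()
    {
        int[] numbers = { 5, 12, 3, 42, 8 };

        // SQL-like query syntax, checked by the compiler
        var big = from n in numbers
                  where n > 10
                  orderby n
                  select n;

        foreach (int n in big)
            Console.WriteLine(n);  // prints 12, then 42
    }
}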

Beyond relational data – SQL Server 2008 enables developers to consume and manage any type of data from traditional data types to advanced new geospatial data. Developers can build next-generation database applications that feature new location awareness support and that enable document management capabilities.

Pervasive business insight – SQL Server 2008 provides a scalable infrastructure that can manage reports and analysis of any size and complexity while making it easier for users to access information through deeper integration with Microsoft Office. This enables IT to drive business intelligence throughout the organization. SQL Server 2008 makes great strides in data warehousing, enabling users to consolidate data marts in an enterprise data warehouse.

Sunday, December 2, 2007

NameSpaces for Web Services

Namespace assignment for Web Service proxies is not the most intuitive thing in the world, since the namespace is usually auto-generated. By default, it is the name of the project, a dot, and the name of the Web Service.

Here's a little tip I use when I can't get it right the first time: go into the client project's Web References directory, find the directory for your Web Service, and open the generated proxy file (WebService.cs). At the very top you'll find the fully qualified namespace for the Web Service proxy, which is what you need in order to access the Web Service methods from your code. You can either type the fully qualified namespace path plus the class name each time you create a reference to the object, or you can add it to your namespace references at the top of the class.
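
For example (the project, server, and service names here are hypothetical):

// Fully qualified on every use:
MyClientApp.MyServer.OrderService svc = new MyClientApp.MyServer.OrderService();

// Or import the proxy namespace once at the top of the file:
// using MyClientApp.MyServer;
// ...after which the short form works:
// OrderService svc = new OrderService();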

Another hint – try to name your namespaces consistently. Avoid using the same name for a namespace and a class. When you create a project, create all client code in the scope of that namespace so other modules can reference each other by their specific single namespace name.

Faster access to SQL Server using the SqlClient data library


.NET supports data access through ADO.NET, which provides a generic mechanism for accessing data. Unlike previous versions of its data connector technology, Microsoft decided to break with tradition and provide a customized library for SQL Server, meant to improve performance by bypassing the high-level OleDb API when accessing the database engine. The SqlClient namespace provides access to SQL Server specific versions of the objects for connecting to and accessing the database, while the OleDb namespace houses the same functionality for generic OleDb driver access. You can use either to access SQL Server, but the SqlClient objects will be somewhat faster, being tuned specifically for SQL Server.
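
A minimal sketch of the SqlClient objects in use (the connection string, table, and query are illustrative):

using System;
using System.Data.SqlClient;

class SqlClientDemo
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=Northwind;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT TOP 5 CompanyName FROM dbo.Customers", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0));
            }
        }
    }
}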

In practice, this means that if you want to use the SQL Server specific classes, you will need to bracket your code if you also want to support other data sources, because the object names vary for each class library. All SQL Server objects start with Sql, like SqlConnection, SqlCommand, and SqlDataAdapter, while the OleDb versions start with OleDb.

The interfaces for the driver objects are open and extensible through .NET, so you can expect other database vendors to build .NET classes optimized for their data engines. In fact, building a new provider is relatively painless and involves overriding and implementing a handful of methods in the various data access classes (Connection, DataAdapter, Command), so it's possible to use straight .NET code to create a provider with much less effort than was previously required through the OleDb interfaces.

Dataset objects in .Net

DataSets internally represent data in XML format, and there's practically no overhead involved in generating XML from the data contained within them. Depending on which parts of a DataSet object you access, the representation of that data can always be presented as XML.

DataSets are also very flexible in dealing with incoming XML data. Pass a DataSet some data in XML format and .NET will create a DataSet from it. This means that a DataSet can work without any underlying datasource, using XML as the data store instead. You can load an XML document into a DataSet, modify the data, and then write it back to disk for persistence. This is perfect for configuration information or other local data that does not rely on complex rules or have to deal with multi-user concurrency issues.
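
A minimal sketch of that round trip (the file name and the "Value" column are illustrative):

using System;
using System.Data;

class DataSetXmlDemo
{
    static void Main()
    {
        DataSet ds = new DataSet();
        ds.ReadXml("settings.xml");           // load the XML document straight into a DataSet

        DataTable table = ds.Tables[0];
        table.Rows[0]["Value"] = "updated";   // modify the data in memory ("Value" is a hypothetical column)

        ds.WriteXml("settings.xml");          // persist it back to disk
    }
}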

It also means that you can use a DataSet as a mechanism to navigate XML instead of an XML reader or DOM parser. Take note: DataSets are about more than mere data!

Microsoft Surface in brief

After years of covert development, Microsoft says it will release a computer that uses the tabletop as its high-resolution display, recognizes objects placed on the surface and skips the traditional keyboard and mouse in favor of fingers on the screen.

What it is: A computer in the form of a table, using the hard acrylic tabletop as a high-resolution screen. First product from Microsoft's previously secret Surface Computing team, which has 120 employees.

How it works: The surface itself isn't touch-sensitive, but a series of cameras inside the table can see when someone places or drags a finger, hand or any other object on or across the tabletop screen. Internal projector lights screen from beneath.

Interface: People can use their hands to touch and move virtual objects on the screen, just as they would with a mouse on a traditional PC. The system can also recognize objects placed on the surface, based on their shape or on special codes affixed to them.

Size: 22 inches high, 21 inches deep, 42 inches wide, with a 30-inch screen.

Technology: Uses a custom software interface on top of Microsoft's Windows Vista. Comes with wired Ethernet, integrated Wi-Fi and Bluetooth wireless, hard drive and 1 GHz processor.

Initial Customers: Harrah's Entertainment, Starwood Hotels and Resorts, T-Mobile and IGT, the gaming technology company. Microsoft says consumer availability is still a few years away.

The company envisions a variety of uses. In one example, people place a card on the table to call up a virtual stack of digital photos from a computer server and then rotate, resize and spread them across the table using their hands. In another, diners split a tab by dragging icons of their meals to their credit cards.

Whether the technology catches on remains to be seen. Microsoft isn't the only company eyeing the market. But in the meantime, it isn't science fiction: Microsoft has been showing functioning models for months in closed-door sessions.

The company is slated to publicly unveil the machine -- dubbed "Microsoft Surface" -- at a Wall Street Journal conference Wednesday.
Microsoft says businesses will start deploying the machines in retail and entertainment settings in November. Starwood Hotels and Resorts, Harrah's Entertainment and T-Mobile are among those planning to use Microsoft Surface.

Longer term, the Redmond company says, it is aiming for the broader consumer market.
"We think this is a multibillion-dollar industry," said Pete Thompson, general manager of Microsoft Surface Computing. "We think this is something that is going to be pervasive. ... We don't want it to be a novelty."

The product is coming out of the same area of Microsoft that develops the company's profitable line of keyboards and mice.

However, Microsoft's whiz-bang technologies haven't always caught on with consumers. Past flops include the smart watch and the Portable Media Center. Its Zune music player was a belated response to Apple's dominant iPod. And the Tablet PC, a conceptual forerunner to the new machine, hasn't caught on in the way Microsoft Chairman Bill Gates predicted.
In that way, the Surface machine will test anew Microsoft's ability to strike a chord with consumers, and to expand beyond its traditional Windows and Office software businesses.
Price will be a major obstacle for the new machine to overcome if it is to catch on with mainstream consumers. Thompson declined to disclose terms of the enterprise agreements under which Microsoft is selling the machines, which include related software and services.
However, he said, "If we made this a product sale, think of it in the range of $5,000 to $10,000 per unit."

It could be three or more years before it hits the broader consumer market, he said.
"The potential is there," said Doug Bell, industry analyst with the IDC market research firm. "Once you get this into hotel rooms or consumers' homes and you bring the price point down, the market could be there. It just needs to be created at this point."
Microsoft will start with several "showcase" commercial deployments numbering in the dozens of units, Thompson said. Microsoft hopes consumers will want the machines after using them in commercial settings.

In one example, the Surface computer could recognize a phone pulled off the wall and placed on the tabletop by a customer at a T-Mobile store. It would display features of the phone, show a pricing list and let the customer drag icons representing elements of a service plan onto the phone, before sending the virtual package to the register for purchase.
Harrah's plans to start by using the Surface tables as a "virtual concierge" desk in conjunction with its Total Rewards loyalty program at its eight Las Vegas properties, which include Caesars Palace and Bally's. People will be able to use the tables to access maps of the different properties, get details about events and venues and create itineraries for themselves.
Further down the road, Harrah's is exploring options including food and beverage ordering, and possibly gaming or game-related activities, said Tim Stanley, Harrah's chief information officer.
Another business that plans to use the Microsoft Surface computers is IGT, the gaming technology company.

Starwood plans to make the machines available in public areas of select Sheraton properties, including in Seattle, starting later this year. It's exploring possible uses including music playlist browsing, photo sharing, games, food ordering and the virtual concierge idea.
"We were just wowed," Hoyt Harper II, a Sheraton senior vice president, said of the first time hotel executives saw the machine. "From the get-go, you could tell it was something unique and different and special."

Microsoft isn't the first to show such a machine. Jeff Han of New York University's Courant Institute of Mathematical Sciences has demonstrated similar prototypes at past Technology Entertainment Design conferences, and he has formed a company to market the technology.
Han's examples include a tabletop photo-sharing scenario similar to one that Microsoft has been showing recently.

Microsoft Surface, created under the code name "Milan," is the first product from the Surface Computing team, a hitherto unknown group that has grown, under the radar, to 120 people. The machine uses a specialized interface on top of Windows Vista. But with the product, Microsoft is breaking from its traditional PC model by offering hardware, not just software.
The table is about 22 inches high and 42 inches wide, with a 30-inch screen. It can be used simultaneously by multiple people sitting on different sides of the table. The components of the machine are inside the table, including a hard drive and a standard 1 GHz computer processor.
It's not a touch-sensitive screen. Instead, it relies on multiple cameras beneath the table that can see when someone touches it. It recognizes objects based on shape or by using domino-style identification labels on the bottom of the objects.

A projection system and optical technology sit beneath the hard acrylic tabletop screen, which itself doesn't contain electronics. Microsoft says it should be durable enough to serve as a restaurant table, spills and all.

Microsoft says it eventually plans to expand into other shapes and sizes of surface computers, including versions that could hang vertically on a wall.
The company says the product's genesis came in 2001, arising from brainstorming sessions between Andy Wilson of Microsoft Research and Stevie Bathiche of Microsoft Hardware.
Wilson has shown certain elements of the surface-computing technology publicly, as has Gates. But the company has kept its product plans under wraps until now.
"Bill wanted to announce this years ago. ... The reason we haven't done that is because we wanted it to be real," Thompson said. "I don't want it to just be nifty technology."


Thursday, November 29, 2007

Authenticating Users with ASP.NET AJAX



ASP.NET 2.0 provides built-in membership management capabilities that allow applications to log users into and out of a Web site with minimal coding. Simply run the aspnet_regsql.exe tool to add Microsoft's membership database into your chosen database server (otherwise, a SQL Server 2005 Express database will be used), add a few lines of configuration code in web.config to point to your database, drag on a few controls such as the Login and CreateUserWizard controls, and you're ready to go!

However, each time a user logs in to your application, a postback operation occurs which, in some situations, may not be desirable. In cases where you'd like to log users into a Web site without performing a complete postback of a page, you can use the ASP.NET AJAX authentication service instead.

The authentication service consists of a service that lives on the Web server that accesses membership information from the database, as well as a client-side class named AuthenticationService (located in the Sys.Services namespace) that is built into the ASP.NET AJAX script library. The AuthenticationService class knows how to call the membership service using the XmlHttpRequest object behind the scenes.
To use the AuthenticationService class to log users in or out of a Web site, you must first enable the authentication service on the server. This is done by adding code into web.config as shown below.




<system.web.extensions>
  <scripting>
    <webServices>
      <authenticationService enabled="true" />
    </webServices>
  </scripting>
</system.web.extensions>

This code enables calls to a file named _AppService.axd to be made behind the scenes and allows membership credentials to be passed and validated. _AppService.axd doesn't actually exist as a physical file; it's really an alias for an HttpHandler named ScriptResourceHandler that's responsible for handling log-in and log-out functionality within ASP.NET AJAX applications. The handler is configured automatically when you create an ASP.NET AJAX-enabled Web site in Visual Studio 2005, as shown in the following code:

<httpHandlers>
  ...
  <add verb="*" path="*_AppService.axd" validate="false"
       type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions,
             Version=1.0.61025.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>
  ...
</httpHandlers>

Once you've enabled the ASP.NET AJAX authentication service in web.config you can use the client-side AuthenticationService class to log users into a Web site using an asynchronous postback operation. The AuthenticationService exposes login() and logout() methods, as well as several different properties.

1. defaultFailedCallback: Gets or sets the default failure callback method.
2. defaultLoginCompletedCallback: Gets or sets the default login callback method.
3. defaultLogoutCompletedCallback: Gets or sets the default logout callback method.
4. isLoggedIn: Used to determine whether the user is currently logged into the application.
5. path: Gets or sets the authentication service path.
6. timeout: Gets or sets the authentication service time-out value.

The AuthenticationService's login() method performs an asynchronous postback operation that calls the ScriptHandlerFactory HttpHandler mentioned earlier to log a user into a Web site. The overall process still involves setting a cookie containing the ASP.NET membership authentication ticket in it as with standard ASP.NET applications, but the cookie is set without reloading the entire page. The login() method accepts several different parameters, as shown here:

1. userName: The user name to authenticate.
2. password: The user password to use while authenticating.
3. isPersistent: Determines whether the issued authentication ticket should persist across browser sessions. The default is false.
4. customInfo: Reserved by Microsoft for future use. Defaults to null.
5. redirectUrl: The URL to redirect the browser to on successful authentication. If null, no redirect occurs. The default is null.
6. loginCompletedCallback: The method to call when the login has finished successfully. The default is null.
7. failedCallback: The method to call if the login fails. The default is null.
8. userContext: User context information to pass to the callback methods.
You can see that login() takes quite a few parameters, although several of them are optional. The key parameters are userName, password and loginCompletedCallback.


The sketch below uses the AuthenticationService's login() method to attempt to log a user into a Web site. The code first calls the AuthenticationService class's login() method and passes in the user name, password, log-in completed callback handler, and failure handler.
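
Here is a minimal client-side sketch (the textbox IDs are illustrative, and the parameter order follows the list above):

function LoginUser() {
    // Read the credentials the user typed (element IDs are hypothetical)
    var userName = document.getElementById("txtUserName").value;
    var password = document.getElementById("txtPassword").value;

    // userName, password, isPersistent, customInfo, redirectUrl,
    // loginCompletedCallback, failedCallback, userContext
    Sys.Services.AuthenticationService.login(userName, password,
        false, null, null, OnLoginCompleted, OnLoginFailure, null);
}

function OnLoginCompleted(isValid, userContext, methodName) {
    if (isValid) {
        // The authentication cookie has been set; update the UI here
    }
}

function OnLoginFailure(error, userContext, methodName) {
    alert("Unable to log in at this time.");
}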

If the log-in attempt completes successfully, the method named OnLoginCompleted() is called. You know whether the user successfully logged in by checking the isValid parameter. If the log-in attempt fails, due to the service being unavailable or other circumstances, the OnLoginFailure() method is called, letting the user know that they're not able to log in at this time.

To log a user out of a Web site, you can call the AuthenticationService's logout() method. Be aware that this method will cause a full-page postback operation to occur to ensure that the authentication cookie is properly removed from the user's browser. This is standard behavior, so don't waste any time trying to figure out why an asynchronous postback isn't occurring. Parameters that the logout() method accepts are shown here:

1. redirectUrl: The URL to redirect the browser to on successful logout. The default is null.
2. logoutCompletedCallback: The method that is called when the logout has finished. The default is null.
3. failedCallback: The method that is called if the logout fails. The default is null.
4. userContext: User context information to pass to the callback methods.


The sketch below calls the logout() method to remove the authentication cookie from the user's browser and log the user out of the Web site. It defines a log-out completed callback method, as well as a failure callback method.
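
A matching client-side sketch (again, the parameter order follows the list above):

function LogoutUser() {
    // redirectUrl, logoutCompletedCallback, failedCallback, userContext
    Sys.Services.AuthenticationService.logout(null, OnLogoutCompleted, OnLogoutFailure, null);
}

function OnLogoutCompleted(result) {
    // A full-page postback occurs on logout, so this rarely needs to do much
}

function OnLogoutFailure(error) {
    alert("Unable to log out at this time.");
}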


ASP.NET 2.0 membership management

Membership and Role Manager Providers - ASP.NET 2.0 now includes built-in support for membership (user name/password credential storage) and role management services out of the box. Because all of these services are provider-driven, they can be easily swapped out and replaced with your own custom implementation.


Login Controls - The new login controls provide the building blocks to add authentication and authorization-based UI to your site, such as login forms, create user forms, password retrieval, and custom UI for logged in users or roles. These controls use the built-in membership and role services in ASP.NET 2.0 to interact with the user and role information defined for your site.

Monday, November 26, 2007

Introduction to .NET 3.0 for Architects


Check this site to learn about the .NET 3.0 architecture

ASP.NET 2.0 New Features

Click this link to see the features

ASP.NET Tips

Tip: Do not use the AutoPostBack attribute on the DropDownList control to simply redirect to another page.
There are probably cases where this makes sense, but for the most part it is overkill. Using autopostback for a redirect requires an extra round trip to the server. First the autopostback returns to the server and processes everything up to the event handling the postback. Then a Response.Redirect is issued, which goes back to the client asking it to request another page. So you end up with two separate requests, plus processing, just to get the user to another page.
Using the onchange event of the select element, we can do all of this on the client. In the sample below, I am simply redirecting to the current page with an updated querystring element. Your logic will vary, but in this case I am avoiding the zero index.

<asp:DropDownList ID="ddlTopics" runat="server"
    onchange="if (this.selectedIndex > 0) { window.location = window.location.pathname + '?t=' + this[this.selectedIndex].value; }" />

Tip: Never use the ASP.Net Label control.
Ever is a strong word, but except for some quick-and-dirty style hacks you should never use this control. Any text is rendered inside a span element, which is usually unnecessary and complicates any CSS styling you may be trying to apply. In most cases, you can replace the Label with a Literal and achieve the same results.

Tip: Use the ASP.Net Repeater instead of DataList, DataGrid, and DataView controls
The Repeater is the single most powerful control shipped with ASP.NET. It is versatile and lightweight. There are times (especially when prototyping) when the other databound controls make sense, but they generate a lot of extra markup and generally complicate the page with all of their events and styling. Using the Repeater, you may write a little more code up front, but you will be rewarded in the long run.

Tip: Understand how to effectively use caching.
By now, most ASP.NET developers know about the Cache. Most examples show the virtue of caching for hours at a time, but very little data that is worth displaying on a web page needs to be cached that long. The main reasons for caching are performance related, and memory in ASP.NET is still a very limited resource. Do not waste it by caching anything for more than a couple of minutes unless it is very expensive to regenerate.
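
A sketch of that pattern inside a page (LoadTopArticlesFromDatabase() is a hypothetical expensive call):

private DataTable GetTopArticles()
{
    DataTable articles = (DataTable)Cache["TopArticles"];
    if (articles == null)
    {
        articles = LoadTopArticlesFromDatabase();  // hypothetical expensive query

        // Cache for two minutes only; regenerating is cheaper than hogging memory
        Cache.Insert("TopArticles", articles, null,
            DateTime.Now.AddMinutes(2),
            System.Web.Caching.Cache.NoSlidingExpiration);
    }
    return articles;
}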

Tip: Always set a memory threshold for your AppPool.
A related tip: first understand the total memory available on your box. How many sites are there? Is SQL Server running locally? Is there anything else on the box that will consistently use memory?
In most cases, you should never set the available memory for an AppPool above 800 MB unless you can also set the /3GB switch (then you can use about 1,200 MB). Allowing memory to go unchecked, or setting it above 800 MB, can bring even a moderately sized site to its knees once too much memory is used.

Tip: Use app_offline.htm for updates
If you are making any changes to files in your bin directory, always use the app_offline.htm file. It is very likely that while you are uploading (or copying and pasting) your updates, users will see an error message. It is much better to show them one that you purposely created and that explains the situation, rather than the built-in ASP.NET error pages (or even your own customErrors page). In addition, this helps prevent the locking .dll issue that is not supposed to exist anyway.

Tip: Always check Page.IsValid in your button's EventHandler.
Just because you are using ASP.NET validation controls, do not assume the page cannot be submitted with invalid data.
Also, just because you hide a control, do not assume the buttons/textboxes/etc. on it cannot be submitted. It is perfectly fine to hide a control that a user should not access, but with very little code (or a third-party tool) users can easily make an HTTP POST with any data they choose.
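
A minimal sketch (the handler and control names are illustrative):

protected void btnSave_Click(object sender, EventArgs e)
{
    // Client-side validation can be bypassed, so always re-check on the server
    if (!Page.IsValid)
        return;

    // ...safe to process the submitted data here...
}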

Tip: When redirects are permanent, use a 301 status code.
This used to be a little more manual, but with ASP.NET 2.0 it is even easier:

Response.StatusCode = 301;
Response.RedirectLocation = "http://site.com/newlink.aspx";
Response.End();

Tip: To create fully qualified URLs, use the new VirtualPathUtility class.

string relativePath = "~/somefolder/test/123.aspx";
Uri newUri = new Uri(Request.Url, VirtualPathUtility.ToAbsolute(relativePath));
Again, please add any other suggestions below. I am looking forward to reading them.