WAVE Report

Microsoft Essential Web Services Summit
By James Sneeringer

WAVE0146, WAVE0147 10/18/01

The battle over "Web services" dominates coverage of the IT industry today. But lingering questions remain: what exactly is a Web service? How does it work, what technology does it rely on, and why is it desirable? For many, the major question is Microsoft's .NET initiative - how it will work, how it will provide Web services, and how (and whether) to upgrade.

In light of these and other concerns, the WAVE Report attended a one-day developer's seminar sponsored by Intel and Microsoft, called The Essential Web Services Summit.

Ted Pattison, an instructor for DevelopMentor, a software development training company, led the seminar. The seminar was split into a morning section addressing the architecture and development environment of .NET, which we will cover in this article, and an afternoon overview of Web services, which we will cover in the next issue of the WAVE.


Microsoft bills the .NET architecture as the biggest change in its operating systems since the transition from DOS to NT. It represents a departure from long-standing Windows development standards such as Win32, COM and DCOM, and the C++ programming language. Visual Basic .NET (VB.NET) is the first version of the Visual Basic (VB) programming language for which Microsoft was not concerned about backward compatibility. In essence, .NET is a completely new architecture for developing and running applications.

Pattison presented three main forces driving the development of .NET:

1) Microsoft is seeking to ease the transition from 32- to 64-bit-based computing, having learned from the complications of the 16- to 32-bit switch.

2) Microsoft is seeking to compete with Java on lightweight devices such as PDAs. They want to make it possible for established VB and C programmers to write code that will run well on such devices.

3) Microsoft is seeking to expand its customer base to include Unix- or other OS-based servers. This is in anticipation of a plateau (already beginning) in sales of PCs and traditional Windows, and an ensuing switch to a service-based business model.

How It Works

Pattison stressed that platform independence was the overriding goal of Microsoft when developing .NET. He stated they felt the way to achieve this was to use established standards where possible.

A Windows application consists of many separate components that must be found and integrated to run the program. The existing architecture, known as COM, handled both in-memory (single PC) and across-boundary (networked) integration of application components. It assumed Windows was ubiquitous. The .NET architecture does not. Under .NET, in-memory integration is handled by what Microsoft calls the Common Language Runtime (CLR), an "execution engine" that works solely within a single machine. Cross-boundary integration is based on established standards of XML and HTTP.

Under COM, a program written in VB or C++ was compiled directly from that language into machine-readable code, locking it to the Intel x86 architecture and a specific version of Windows. Under .NET, programs written in the new languages VB.NET and C# (pronounced "C sharp," Microsoft's successor to C++) are compiled to an Intermediate Language (IL). When an application is launched, the CLR compiles and manages the IL code in real time, as it runs. This style of processing has several effects:

First, so long as it is running the CLR, any device can run code that has been compiled to IL. Thus, code written once in VB.NET or C# can be used on many different devices.
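The compile-to-intermediate-code idea is not unique to .NET; Python, for instance, also compiles source into a bytecode that its own runtime executes on any platform. As a rough analogy only (this is Python bytecode, not .NET IL), the intermediate form of a function can be examined directly:

```python
import dis

def add(a, b):
    return a + b

# The function has already been compiled to intermediate bytecode, which
# the Python runtime executes -- roughly analogous to IL and the CLR.
print(isinstance(add.__code__.co_code, bytes))  # True
dis.dis(add)  # disassembles the bytecode for inspection
```

The same bytecode runs unchanged wherever the runtime is installed, which is exactly the portability .NET seeks for IL.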

Second, IL code must be extremely self-descriptive, so that the CLR can locate and organize components. In .NET, each assembly of code contains extensive information about itself, called meta data. When running the code, the CLR uses a process called "reflection" to read the meta data.
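Reflection is easiest to see in a language that supports it directly. A minimal Python analogy (not .NET code; the Greeter class is invented for illustration) of discovering a type's members at run time:

```python
import inspect

class Greeter:
    """A small class whose structure we discover at run time."""
    def greet(self, name):
        return f"Hello, {name}"

# Enumerate the methods without knowing them in advance -- the same
# idea the CLR applies when it reads an assembly's meta data.
methods = [name for name, _ in inspect.getmembers(Greeter, inspect.isfunction)]
print(methods)  # ['greet']
```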

Third, IL code is strictly managed by the CLR as it is compiled and run. A major change is that allocating memory is no longer allowed in the code - it is now handled by the system.
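System-managed memory is likewise not unique to .NET. A minimal Python sketch (an analogy, not .NET code) shows a runtime reclaiming an object on its own once the program stops referencing it:

```python
import gc
import weakref

class Node:
    """A throwaway object; we never free it explicitly."""
    pass

n = Node()
ref = weakref.ref(n)   # a weak reference does not keep the object alive
assert ref() is n

del n          # drop the only strong reference -- no free()/delete needed
gc.collect()   # the runtime reclaims the memory on its own
released = ref() is None
print(released)  # True
```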

Fourth, the capability differences between the VB and C programming languages are largely gone, since both must compile to the same IL.

Fifth, all application source code can now be read in IL, and more easily reverse-engineered. While there are security measures to prevent tampering, according to Pattison, the only way to completely hide code in .NET is to keep it on a Web server behind a firewall, and run the application as a Web application (more on that in the next WAVE issue).

Elephant in the Room

During this section of the seminar, a large elephant stood in the corner of the room, and its name was Java. Several times, Pattison referred to aspects of .NET as being "just like Java" or "Java-like." For instance, CLR, the virtual "execution engine" of .NET, is the same concept as the Java virtual machine - although the concept actually predates both architectures. In another example, the term "reflection" for reading meta data is lifted directly from Java. It wasn't clear how much resentment this generated among the Java developers in attendance, although we did overhear two discussions to that effect during breaks. Pattison acknowledged up front that when developing .NET, Microsoft architects had considered existing technologies, and taken things from both COM and from Java.


Security

.NET includes two features that Microsoft claims will increase security. First, the meta data for each code assembly includes a field for a public key token. This allows the assembly code to be "signed" when it is compiled. If the code is changed in any way, it will fail verification unless it is re-signed with the matching private key. Pattison demonstrated how the system is sensitive to changes of even one bit. Microsoft hopes this will prevent tampering.
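.NET's actual mechanism is strong-name signing with a public/private key pair; the sensitivity to single-bit changes comes from hashing the assembly's contents before signing. A simplified Python sketch of just that one property (a cryptographic digest, not the full signing scheme):

```python
import hashlib

original = b"print('hello')"   # stand-in for compiled assembly bytes
tampered = b"print('hellp')"   # a one-character (a few bits) change

digest_before = hashlib.sha256(original).hexdigest()
digest_after = hashlib.sha256(tampered).hexdigest()

# Any change, however small, yields a completely different digest, so
# tampering is detectable when the digest is checked against a signature.
changed = digest_before != digest_after
print(changed)  # True
```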

Second, and a much larger change, programmers can no longer use pointers or allocate raw memory. This is important since many security failures in Windows have been the result of memory errors. Pattison stated that Sun and Microsoft agree: if a programmer can access raw memory, they can defeat any security. Now that the system, not the code, handles memory, Microsoft believes the system will be much more secure.

Windows XP

With the release of XP less than 24 hours from the time of this issue, the question of whether or not to upgrade from Windows 2000 weighs heavily. The answer, from the development side, is no. While XP offers substantial changes in interface and features, the capabilities of .NET are already available in Windows 2000. Our next issue will focus on the most-hyped of those capabilities: Web services.


Second Section - WAVE0147 11/2/01

An industry buzzword, and an integral part of Microsoft's .NET initiative, is the concept of Web services. The important questions are: what is a Web service, what is required to develop one, and why would a business employ one? In the afternoon session of the Microsoft/Intel Essential Web Services Summit (see WAVE0146 for introductory article) DevelopMentor Instructor Ted Pattison provided some answers from the Microsoft point of view.

As Microsoft defines it, a Web service is a distributed application that operates across the Internet. Most applications are made up of a number of individual program and data files, which are called as needed. A Web service operates the same way, but the files exist on separate machines on the Internet. Web services operate behind the scenes, from program to program, but users experience the application as though it resides entirely on their machine. Pattison contrasted this definition with that of Web applications, which users access directly through a browser.

Web services are one way for applications to communicate with each other. The primary advantage over other methods is that, at least in theory, Web services are platform-agnostic. The key is using established standards such as XML and HTTP, rather than proprietary networking protocols. In addition to communication advantages such as the ability to work through firewalls and across different platforms, using established standards may also save development dollars. Standard HTTP infrastructure can balance loads across the network, for example, and security can be handled either by a standard such as SSL or by a third-party service. Microsoft's Passport service was highlighted at this conference, but in fact, providing this type of user authentication service will be one of the major battles of the Web services market - between Microsoft and AOL/Time Warner.

What You Need to Know

Pattison spelled out the four technologies that go into the development of a Web service:

XML is short for Extensible Markup Language. It is a very flexible way to represent data, using a system of tags similar to HTML's (both descend from SGML). XML data is self-descriptive: a single XML file contains both a set of data and information, called meta data, about what types of data are present and how they interrelate. Applications reading such a file receive not only the data but also the ground rules for how that data can be manipulated. XML places no limitations on how meta data can be defined.
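A short illustration of XML's self-descriptive tagging, sketched here in Python with a made-up order document (the element and attribute names are hypothetical):

```python
import xml.etree.ElementTree as ET

doc = """
<order id="1001">
  <item sku="A-42" quantity="3">Widget</item>
</order>
"""

# The tags and attributes describe the data they carry: a reader learns
# both the values and what kind of values they are.
root = ET.fromstring(doc)
item = root.find("item")
print(root.attrib["id"], item.text, item.attrib["quantity"])  # 1001 Widget 3
```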

XSD (XML Schema Definition) is a standardized schema language for XML. It sets forth a limited set of data types that can be declared in XML, and specific rules under which users can define new types. By constraining the meta data in XML, XSD improves compatibility with type systems such as Java's, or Microsoft's CLR (see last WAVE). XSD aims to ensure that an XML file can be mapped to these other standards.
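A fragment of such a schema might look like the following (the order, quantity, and shipDate names are made up for illustration; xs:integer and xs:date are among XSD's built-in types):

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- An 'order' must contain an integer quantity and a date -->
  <xs:element name="order">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="quantity" type="xs:integer"/>
        <xs:element name="shipDate" type="xs:date"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

A document that puts text where an integer belongs fails validation, which is what lets the data map cleanly onto strongly typed systems like Java or the CLR.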

SOAP is the Simple Object Access Protocol. It is a standard for messaging - the process by which two machines communicate before and after file transfers. SOAP defines how one machine can use XML to request a file or acknowledge its receipt.
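A minimal SOAP request, parsed here with Python to show the structure (the GetPrice operation and the example.com namespace are invented for illustration; the envelope namespace is SOAP 1.1's):

```python
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope
SVC = "http://example.com/stock"                    # hypothetical service

request = f"""
<soap:Envelope xmlns:soap="{SOAP}">
  <soap:Body>
    <GetPrice xmlns="{SVC}">
      <Symbol>MSFT</Symbol>
    </GetPrice>
  </soap:Body>
</soap:Envelope>
"""

# The Body holds the actual request; the Envelope is the standard wrapper
# that lets any SOAP-aware machine recognize and route the message.
root = ET.fromstring(request)
symbol = root.find(f"{{{SOAP}}}Body/{{{SVC}}}GetPrice/{{{SVC}}}Symbol")
print(symbol.text)  # MSFT
```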

WSDL, the Web Services Description Language, defines the way Web service interfaces are described from one machine to another. It is a standard by which one machine can let another know what Web services are available, and how they are configured.

Because these concepts are abstract, an example might be helpful. Imagine a local Windows computer invoking a Web service across the Internet on a distant Java machine. It communicates with the distant machine by sending and receiving SOAP messages. WSDL describes the Web services available on the distant machine. XSD allows the data being sent to be mapped into XML on the Windows machine and remapped into Java types at the destination. HTTP carries the messages en route, and security is handled by SSL.
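None of the endpoints in that scenario are public, so as a self-contained sketch the following Python script stands up a toy local "service" and exchanges XML with it over HTTP (the request and reply documents are invented for illustration):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoService(BaseHTTPRequestHandler):
    """A toy Web service: it answers any POSTed XML with an XML reply."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        self.rfile.read(length)  # the incoming XML request
        reply = b"<reply>ok</reply>"
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Start the service on a free local port, in the background.
server = HTTPServer(("127.0.0.1", 0), EchoService)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client": program-to-program XML over plain HTTP.
url = f"http://127.0.0.1:{server.server_port}/service"
req = urllib.request.Request(url, data=b"<request>price?</request>",
                             headers={"Content-Type": "text/xml"})
with urllib.request.urlopen(req) as resp:
    body = resp.read().decode()
print(body)  # <reply>ok</reply>

server.shutdown()
```

In a real deployment the request body would be a SOAP envelope and the endpoint would be described by WSDL, but the transport is exactly this: XML carried over ordinary HTTP.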

Microsoft ASP.NET

Pattison also spent over an hour discussing the intricacies of Microsoft's new tool for developing Web services, ASP.NET, which replaces Active Server Pages (ASP). The tool is designed to allow programmers to develop both Web applications and Web services. There are two major changes to the ASP environment: a scripting language is no longer required, and state management works across Web farms, rather than being tied to a single machine as before. State management is the process of tracking each user's state (logged in or logged out, for example).

Like the rest of .NET, ASP.NET is based on the Common Language Runtime (CLR), a virtual execution engine. ASP.NET is still, however, tied to Microsoft's Internet Information Server (IIS), which has been the focus of numerous security exploits and a great deal of criticism. Microsoft has defined several new file extensions to distinguish Web services and applications from traditional ASP pages. When IIS detects one of these extensions, the request is passed through IIS into the ASP.NET runtime, where it is run by the CLR rather than within IIS. Because the CLR compiles just in time and strictly manages the code it runs, Microsoft predicts Web applications and services will be much more secure than previous Microsoft Web server tools. Several developers we spoke with at the conference, however, felt that as long as IIS was involved at all, security could be too easily defeated. Pattison hinted strongly that future versions of Microsoft Web server software would not include IIS, or would make it optional.