Microsoft .NET and Java: Achieving Interoperability

As we head into 2004, for me it’s a time to reflect on how far our industry has evolved over the past twelve months, and on the pace at which new technologies and ideas are introduced. During the past few years, organizations recognized both the need for and the value of the .NET and Java platforms, recognized that each has specific costs and benefits, and built applications that run on both. Not surprisingly, given that mix of platforms, they’re now finding an increasing need for interoperability, and are starting to consider the best approach for achieving it.

Interoperability through Web services has received the greatest amount of attention, and in this last year we’ve seen many advances in Web services stacks from both Microsoft and many Java vendors. Web services, however, are not the only way to build an interoperability solution. Alternatives include numerous third-party products, asynchronous interoperability using databases, and the use of message queues to relay messages between the two platforms.

Despite all of this new technology, however, and the many articles and publications on the subject, many customers are still unsure which interoperability option they should select. Should they invest strategically in developing applications that use Web services, or instead select a more tactical option that’s more in line with technology they’re already using? What are the performance costs of each option? Should they align their options with the major platform vendors or instead invest in other third-party solutions?

This series of articles uncovers and presents the options that exist today to bridge the gap between Microsoft .NET and Java: interoperability using Web services, binary communication, CORBA, and resource-tier solutions. For each option, you’ll see typical scenarios in which the technology can be applied, and weigh the pros and cons to help address questions and concerns when selecting a technology for use in your own environment.

Web Services
If you were to take the industry as a whole in 2003, it would be fair to say that Web services have been a cornerstone of almost every major vendor’s product line. The foundation of Web services in the Microsoft platform continued through the release of v1.1 of the .NET Framework and support for Web Services Enhancements (WSE) v1.0 in 2003. IBM continued to increase Web services support through the WebSphere 5.x platform and their Emerging Technologies Tool Kit (ETTK), an AlphaWorks release. Other major J2EE application vendors also strengthened their support for Web services, and even smaller vendors underwent some changes, the most notable of which was webMethods’ acquisition of The Mind Electric towards the end of the year.

Given this rich landscape of support for Web services, you may be wondering if and how all the vendors ensure that their implementations play well with others. This is the job of the Web Services Interoperability Organization, more commonly known as the WS-I. The WS-I is a consortium of over 170 vendors, system integrators, and enterprise customers with the overall goal of ensuring and promoting Web services interoperability throughout the industry. 2003 was a notable year for the WS-I because it marked the release of the ‘Basic Profile 1.0’, a documented profile of exactly what makes up a Web service. For v1.0, this is a set of existing specifications (SOAP 1.1, WSDL 1.1, UDDI 2.0, XML 1.0, XML Schema, and HTTP 1.1), tied together with a set of recommendations outlining the role of each standard and the resolution of many potential interoperability issues.

So, given this huge industry momentum, the flexibility and extensibility of SOAP and the ubiquitous way in which Web services traverse both firewalls and proxy servers, you may ask: Why is there a need for any other interoperability solution apart from Web services? The answer typically lies in what Web services cannot achieve today.

Data passed using Web services inherently depends on XML and XSD. Web services pass SOAP messages encapsulating this XML data to another party using HTTP. For interoperability between .NET and J2EE, this can present two potential issues.

First, serializing data from in-memory objects to XML can be relatively expensive, both computationally and for the resulting size of the data. Taking large objects and converting them to an ASCII-based XML representation can, in some cases, produce large documents. When you’re passing a single message or document across a network each day, that’s unlikely to be an issue. If, however, you need to send thousands of messages per second between two systems, the additional overhead can be a consideration.

With that overhead in mind, I think there is also an element of perception at work in this argument. Many arguments against Web services start from the point of view that translating objects into XML, creating SOAP messages, and sending them across HTTP is always going to be slow, often to the point of being unusable. Some organizations have even refused to adopt any Web services strategy for exactly this reason. My advice? If that’s your concern, prove it. Take data that is representative of your needs and build a proof-of-concept application that passes that data between platforms using Web services. With the advances that have been made in both Web services stacks and XML parsers, many who try this are surprised at how efficient modern Web services are.
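
As a starting point for such a proof of concept, the following is a minimal sketch of a Java endpoint using JAX-WS annotations; the Order type, its fields, and the URL are hypothetical stand-ins for your own representative data, and a .NET client generated from the published WSDL (for example, with wsdl.exe or “Add Web Reference”) could call it repeatedly while you measure round-trip times.

// A minimal sketch, assuming JAX-WS (javax.jws / javax.xml.ws) is available;
// the Order class, its fields, and the URL are hypothetical examples.
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

@WebService
public class OrderEchoService {

    // Hypothetical payload type standing in for your own representative data.
    public static class Order {
        public String id;
        public double amount;
    }

    // Echoes the payload back so a .NET client can measure the full
    // serialize/deserialize round trip across the two platforms.
    public Order echo(Order order) {
        return order;
    }

    public static void main(String[] args) {
        Endpoint.publish("http://localhost:8080/orderEcho", new OrderEchoService());
        System.out.println("Proof-of-concept endpoint published; WSDL available at ?wsdl");
    }
}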

The second issue with Web services and interoperability is the HTTP protocol itself. With little doubt, HTTP is the most universal protocol in use for Internet technologies today, and for many applications HTTP is the perfect transport for Web services. HTTP, however, is a request-response style protocol: the client makes a call and expects a response within the lifetime of the call. Some applications instead need to work within an asynchronous model. For example, a loan or credit card application process may take several hours to complete, and it doesn’t make sense for the client to hold the call channel open for that duration.

Applications today normally overcome this type of ‘asynchronous call’ by having the client regularly poll the server to see if a requested operation has completed. While polling works, it’s inefficient, and those inefficiencies sometimes demand other transports to return data to the client. Another transport, for example, could be a TCP channel that the client can listen to, or the server could even return the Web services response as an SMTP message (what better proven way to send a message back to a client?).
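
The sketch below illustrates that polling pattern in Java; the status URL, the "COMPLETED" marker, and the 30-second interval are hypothetical choices, not part of any standard.

// A minimal polling sketch; the URL, status values, and interval are hypothetical.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class StatusPoller {
    public static void main(String[] args) throws Exception {
        URL statusUrl = new URL("http://example.com/loanApplications/42/status");
        String status;
        do {
            HttpURLConnection conn = (HttpURLConnection) statusUrl.openConnection();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                status = in.readLine();   // e.g. "PENDING" or "COMPLETED"
            }
            conn.disconnect();
            if (!"COMPLETED".equals(status)) {
                Thread.sleep(30_000);     // the repeated, wasted round trips are the inefficiency described above
            }
        } while (!"COMPLETED".equals(status));
        System.out.println("Operation finished with status: " + status);
    }
}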

While this is all possible within the model of Web services, most implementations today support only HTTP. Other transports are gradually starting to be recognized and adopted, but this will take time.

Binary Interoperability
In areas where Web services do not adequately facilitate interoperability today, what are the alternatives? For those concerned with performance and the size of serialized data, one option is to choose a binary interoperability solution. Binary interoperability methods take objects from one platform, serialize them into a binary stream, send them across the network, and then de-serialize the same data on the other platform. Because both parties agree on the binary format to use, the serialization ensures that the binary data is successfully mapped to local data types on each platform.

One way of achieving this today is to use .NET Remoting, a wire-level protocol introduced with v1.0 of the Microsoft .NET Framework that is open to license. As a result, third-party products that offer Java implementations of the .NET Remoting protocol have appeared on the market, one of which is JNBridge Pro.

Toolkits such as JNBridge let applications wrap compiled Java objects (even EJBs) and expose them to .NET callers using the .NET Remoting protocol. Because the protocol supports a binary channel, it can help reduce the packet size of calls going across the network, which in turn can lead to better performance than an approach using XML serialization. In addition, .NET Remoting can be used for connections that hold state, whereas Web services are typically designed for stateless calls. .NET Remoting is also useful for Inter-Process Communication (IPC), where components on each platform are running simultaneously on the same machine. For example, using IPC, you could have a local Java application using Java Swing talking to another local application written in a .NET managed language such as C# or Visual Basic .NET.
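
To make the model concrete, the sketch below shows the kind of plain Java class (it could equally be an EJB) that such a toolkit would wrap; the class, method, and data are hypothetical. The proxy classes that expose it to .NET callers over a binary .NET Remoting channel are generated by the toolkit itself, so only the Java side is shown here.

// A hypothetical Java class that a bridging toolkit could wrap and expose to
// .NET callers; nothing about it is specific to any particular product.
public class CustomerLookup {
    public String findCustomerName(int customerId) {
        // In a real system this might delegate to an EJB or a DAO;
        // hard-coded here to keep the sketch self-contained.
        return (customerId == 42) ? "Contoso Ltd." : "Unknown customer";
    }
}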

The advantages of binary communication do have a price. Applications that use .NET Remoting typically live within the enterprise, and are rarely exposed to other organizations through firewalls or proxy servers (although .NET Remoting does support an HTTP channel and SOAP formatter). This is primarily because the data types exposed by a .NET Remoting server are based on the CLR (Common Language Runtime), whereas the WS-I Basic Profile (and other Web services implementations) rely on more standardized XSD-style data.

In addition, using a binary channel tends to enforce tight coupling on the exposed interfaces: if the methods of exposed components change when using .NET Remoting, you normally have to re-generate stubs and recompile both the client and the server. In contrast, SOAP is far more extensible; you can include additional data in the message header without having to modify the interface (the WSDL document).

In his article, “Calling Java Classes Directly from .NET,” Wayne Citrin, the CEO of JNBridge, will discuss more about his company’s product and the kinds of solutions in which you might prefer to achieve interoperability with .NET Remoting. In a related article, “Hosting .NET Controls in Java,” Heath Stewart will talk more about a Java client consuming .NET components using a similar approach.

Interoperability Using CORBA
If binary interoperability is a must but .NET Remoting is not a recognized standard within the organization, another alternative exists for those that have standardized on CORBA (Common Object Request Broker Architecture).

The last twelve months have seen the introduction of a number of open source and commercial products that provide .NET clients with the ability to invoke remote objects written to the CORBA specification, and that aren’t limited to interoperability only with Java-based applications. Typically, these products enable a .NET client to use IIOP (Internet Inter-ORB Protocol) to invoke remote components. One such commercial product is Borland’s Janeva.

This approach is most useful for organizations that have deployed CORBA server-side components but that do not wish to make any changes to them. Although using .NET Remoting provides binary interoperability, many toolkits still require some modification to server-side objects before clients can call the server component. Using a toolkit that allows a .NET client to natively reference a CORBA object overcomes this by requiring modifications only to the client.
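
The sketch below shows the shape of a Java CORBA server that could stay untouched in this scenario; it assumes a hypothetical IDL interface named Greeter compiled with idlj (which generates the GreeterPOA skeleton). A .NET client would reach it over IIOP through stubs generated by a product such as Janeva, with no change to the servant.

// A minimal sketch of an unchanged Java CORBA servant, assuming a hypothetical
// IDL interface "Greeter" whose skeleton (GreeterPOA) was generated with idlj.
import org.omg.CORBA.ORB;
import org.omg.PortableServer.POA;
import org.omg.PortableServer.POAHelper;

class GreeterImpl extends GreeterPOA {
    public String greet(String name) {
        return "Hello, " + name;
    }
}

public class GreeterServer {
    public static void main(String[] args) throws Exception {
        ORB orb = ORB.init(args, null);
        POA rootPoa = POAHelper.narrow(orb.resolve_initial_references("RootPOA"));
        rootPoa.the_POAManager().activate();

        org.omg.CORBA.Object ref = rootPoa.servant_to_reference(new GreeterImpl());
        // Print the IOR; a .NET client could use it (or a naming service) to bind over IIOP.
        System.out.println(orb.object_to_string(ref));

        orb.run();
    }
}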

One disadvantage is that the approach mandates a “.NET client to Java server” style of architecture. In many applications there are occasions where Java components need to invoke remote .NET objects. A good example of this is where an organization is implementing a Service Oriented Architecture (SOA) and has a combination of services developed using both technologies.

In his article on Borland Janeva, “Connecting CORBA to .NET,” Bob Swart will cover this scenario in more detail and look at the types of applications that can benefit from this approach.

Resource Tier Interoperability
Interoperability solutions using Web services, .NET Remoting, or CORBA are typically synchronous in nature, and occur between two parties (a client and a service). In typical calls, the client calls the service, some data is returned, and the call is finished.

In applications that need to behave in a more asynchronous fashion, interoperability at the resource tier, which implies using either a database or a message queue, may be the answer. Using a database or message queue to connect platforms based on .NET and J2EE can be one of the easiest ways to achieve interoperability between the two. Database solutions typically share a table between the two platforms, and each uses its preferred method for connecting to the database (ADO.NET for .NET, JDBC for the Java platform). Investing in stored procedures at the database tier can also help reduce duplication of code.
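
As an illustration, the sketch below shows the Java half of such an exchange writing to a shared table via JDBC; the connection URL, credentials, table, and columns are hypothetical, and the .NET half would read the same table through ADO.NET (or through a shared stored procedure).

// A minimal sketch of writing to a shared table from Java; the JDBC URL,
// credentials, and table layout are hypothetical examples.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.UUID;

public class SharedTableWriter {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:yourvendor://dbhost/interop", "appUser", "secret");
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO INTEROP_MESSAGES (MSG_ID, PAYLOAD, STATUS) VALUES (?, ?, 'NEW')")) {
            ps.setString(1, UUID.randomUUID().toString());
            ps.setString(2, "<order><id>42</id></order>");  // both platforms agree on this message format
            ps.executeUpdate();
        }
        // A .NET process would poll (or be triggered) to pick up rows with STATUS = 'NEW' via ADO.NET.
    }
}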

Interfacing with message queues works in a similar way, although each platform will normally have to obtain a driver from the message queue vendor in order to establish a connection. This may be a JMS (Java Message Service) driver for the J2EE platform, or a specific set of classes for .NET. For example, IBM offers WebSphere MQ 5.3 drivers with similar style interfaces for both Java and .NET. Many message queue vendors now also offer the ability to communicate via more standardized interfaces such as Web services.
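
The sketch below shows what the Java side of a queue-based exchange might look like using the standard JMS API; the JNDI names and payload are hypothetical, and the .NET side would send or receive the same messages through the vendor's .NET classes.

// A minimal JMS producer sketch; the JNDI names and payload are hypothetical,
// and the connection factory and queue would be configured for your vendor's broker.
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.naming.InitialContext;

public class QueueSender {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/InteropConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/InteropQueue");

        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            producer.send(session.createTextMessage("<order><id>42</id></order>")); // agreed message format
        } finally {
            connection.close();   // closing the connection also closes its sessions and producers
        }
    }
}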

In addition to providing good support for asynchronous-style calls, using the resource tier can prove beneficial for n-to-n interoperability, where multiple clients need to communicate with multiple servers. In most of the previous interoperability solutions, each client needs to know the location of each server. In a scenario using an intermediary database or message queue, both sides must agree on the format of the message, but the client does not necessarily need to know the service’s location, which removes the need for location-based information and can also help provide support for failover and load balancing.

For all the advantages that a database or message queue provides, however, both are really designed for asynchronous-style communication. Using them in a synchronous-style call (for example, where an ASP.NET page requires a response before displaying the result to the user) can lead to potential performance issues. In addition, using either a database or message queue introduces yet another piece to the interoperability puzzle. For situations where many machines have a requirement for interoperability, it may be worth the investment in administration and additional machines; but in a scenario where just one client needs to interoperate with a single service, this solution simply introduces overhead.

In his article, “Java/.NET Interoperability via Shared Databases and Enterprise Messaging,” Kyle Gabhart investigates the approach of using either a database or message queue to achieve interoperability between Microsoft .NET and Java.

Compare, then Decide
It is fair to say that there’s no universal answer for interoperability today. While Web services offer incredible potential moving forward, applications already in place today may require more of a tactical approach. To help you differentiate between the various approaches, Table 1 lists some of the advantages and disadvantages of each.

Table 1. Interoperability Comparison Matrix: The table shows some of the advantages and disadvantages of each interoperability solution.

Criterion | Web services | .NET Remoting | CORBA Interoperability | Resource Tier
Integrated Platform Support | Yes | Partial | Partial | Yes
XML / SOAP Interoperability | Yes | Yes* | No | Optional
Binary Interoperability | No | Yes | Yes | Optional
Designed for Synchronous Style Calls | Yes | Yes | Yes | No
Designed for Asynchronous Style Calls | No** | No | No | Yes
Requires modifications to existing server-side components | Yes*** | Partial | No | N/A
Extensible without re-compile | Yes | No | No | No

* Uses CLR data types, not XSD
** Assuming HTTP as the transport
*** The majority of modifications are required to ensure the data types exposed are XSD compliant

We hope that this special report offers you some insight into the merits and suitability of the various interoperability approaches.
