March 1996

Microsoft Systems Journal

Exploring the Client/Server Capabilities of Visual Basic 4.0 Enterprise Edition

Guy Eddon and Henry Eddon

Guy Eddon is a freelance software developer, author, and instructor. Henry Eddon is also a software developer. Both can be reached at

Click to open or copy the VBDB project files.

The new Enterprise Edition of Visual Basic 4.0 introduces several new features that make writing efficient and robust client/server applications easier. In this article, we'll explore the following features of the 32-bit version of Visual Basic® 4.0 Enterprise Edition as they apply to client/server application development.

  • The Microsoft® Jet database engine
  • The Jet data control and data-bound controls
  • Data Access Objects
  • ODBC and the ODBC API
  • Remote data objects and the Remote data control
  • Remote automation

We'll also explain how these features can be implemented in various client/server environments. Finally, we'll present the VBDB timing application, an application we wrote to compare the performance of the three major access methods discussed in this article (DAO, RDO, and ODBC).

The Microsoft Jet Database Engine

In 1993, Visual Basic 3.0 arrived with a built-in data control (which we'll call the Jet data control to differentiate it from the new Remote data control). Visual Basic 3.0 also provided other data-aware controls that could be bound to a Jet data control, and an object-oriented programming interface to the Jet 1.1 database engine called Data Access Objects (DAO). Visual Basic 4.0 contains the Jet 3.0 engine, which gives Visual Basic new database features and capabilities. (Visual C++® 4.0 also includes the Jet 3.0 database engine.) Utilizing the Jet engine, Visual Basic-based applications can access three categories of data sources: Microsoft Access MDB files; third-party desktop database files including dBASE, Btrieve, Paradox, and FoxPro; and ODBC data sources.

Jet's native database file format is the Microsoft Access MDB file format. This should not come as a surprise, since Microsoft Access uses the Jet engine. Jet supports a variety of external desktop database files as well; it can import the external data, or it can leave the data in the external format and work with it in place. Jet also supports ODBC data sources, which usually consist of a back-end SQL server (such as Microsoft SQL Server™, Sybase System 11, or Oracle 7 Server) or any other data store for which an ODBC driver is available. Not only can the Jet engine retrieve and update data located in any of these data sources, it can also connect to all three types concurrently. Using the Jet engine, it is quite common to fetch and join heterogeneous data from one or more ODBC data sources and desktop database files.

Visual Basic Database Architectures Using the Jet Engine

A Visual Basic database application usually has three parts (see Figure 1). The user interface component consists of the actual forms, controls, and associated code in your application; it requests and displays data by interacting with the Jet database engine, which in turn performs the requested operations on the data store and returns the results to the application. The data store consists of the file or files that contain the actual data.

Figure 1 Single-user Database App

Figure 1 shows the architecture of a single-user, personal database application. Figure 2 shows the configuration of a multiuser remote database system. Here, the clients run the interface (your application), which calls the Jet engine. While this is a multiuser system, it's still a far cry from a true client/server configuration. Every client has its own copy of the Jet engine, and the server is used for nothing more than shared disk space.

Figure 2 Multiuser Remote DBMS

OK, so what if you want a true client/server system? Well, we're almost there, except for one minor problem: Jet is not a true client/server database engine. Luckily, Jet can provide access to a full-featured database server like SQL Server 6.0 via ODBC, as shown in Figure 3. Here the Jet engine does not update the physical data store directly. It connects to the ODBC data source, optimizes queries, and acts as a general conduit of data between your application and the real database engine (such as Microsoft SQL Server). This allows one database engine to control access to the data, as opposed to having each client's Jet engine attempt to cooperate with the other clients' Jet engines.

Figure 3 Accessing SQL Server via Jet


The Jet Data Control

The simplest way to access data from Visual Basic is via the Jet data control. Applications that use the Jet data control can allow the user to view and update records from any of the supported data sources without writing a single line of code. You do this by using the Jet data control in combination with data-bound controls. You can determine whether a control is data-aware (can be bound to a Jet data control) by checking to see if it has DataSource and DataField properties. By setting a data-aware control's DataSource property (to the data control's Name property) and DataField property (a field in the data control's RecordSource), you can bind that control to a specified field in a table. The control (for example, a text box) will display the data from that field in the current record as the user cycles through records in the table. It will also update the record to reflect any changes to the data. (Data controls use the concept of a current record: at any given time, bound controls display the data referenced by the current record of the Jet data control.) While these are powerful features, they lack some functionality, such as methods to delete records, process transactions, or create new tables.

Data Access Objects API

To meet the needs of a somewhat more sophisticated database application, the Jet engine exposes an object-oriented programming interface called Data Access Objects (DAO). DAO is a programming interface for the Jet engine, and thus provides full access to the features of the Jet database engine, unlike the Jet data control. In earlier versions, this was not always the case. Improvements to the DAO interface in Jet 3.0 address this issue.

As you can see in Figure 4, DAO closely models the structure and hierarchy of actual database components: everything from the database itself to the tables, fields, indexes, and relationships contained within it. The beauty of DAO is that you are working with objects that refer to components of a virtual database. Your actions upon these objects cause the Jet engine to manipulate the physical data store. The physical database can be an MDB file stored on your local machine or an Oracle server running on a VAX; the code you write is identical.

Figure 4 DAO Hierarchy

The features of DAO can be split into two broad categories: those designed to define a database's schema, and those designed to work with data in an existing database. The former is sometimes called the Data Definition Language (DDL), and the latter the Data Manipulation Language (DML). These are not different languages, just different operations that DAO supports.

In Visual Basic, the DBEngine object represents the Jet database engine itself. DBEngine contains within it a collection of Workspaces-the first of which, Workspaces(0), is created automatically. A Workspace object is a session in which you can create and open a database object. The following example uses the DDL features of DAO to create an MDB database file:

Dim MyWs As Workspace, MyDB As Database
Dim MyTable As TableDef, MyField As Field
Set MyWs = DBEngine.Workspaces(0)
' Create the MDB file itself
Set MyDB = MyWs.CreateDatabase("MYDBNAME.MDB", dbLangGeneral)
Set MyTable = MyDB.CreateTableDef("MyTableName")
Set MyField = MyTable.CreateField("MyFieldName", dbText)
MyTable.Fields.Append MyField
MyDB.TableDefs.Append MyTable

Writing code that uses the Data Access Objects is not the only way to create an MDB database. You can, for example, use Microsoft Access or Data Manager (a database management and definition tool that comes with Visual Basic) to create the database interactively. Once defined, a database can be opened and manipulated by using the DML features of DAO. The following example uses the OpenDatabase method; notice that the objects defined within the database (tables, fields, and so on) are immediately available once the database is opened:

Dim MyWs As Workspace, MyDB As Database
Dim MyTable As TableDef, MyField As Field
Set MyWs = DBEngine.Workspaces(0)
Set MyDB = MyWs.OpenDatabase("MYDBNAME.MDB")
Set MyTable = MyDB.TableDefs("MyTableName")
Set MyField = MyTable.Fields("MyFieldName")

When working with the database "objects" (Database, Workspace, Recordset, and so on), you are actually working with variables that refer to the underlying objects (that is, references). This is why the Set keyword is required; it says, "set this variable to refer to that object." Once a database table has been opened, the fields of the current record can be referenced and updated as follows:

' shorthand for MyTable.Fields("MyFieldName")
MyTable![MyFieldName] = "Data in my record!"

It is important to realize that DAO and the Jet data control are not mutually exclusive interfaces. In fact, they are often used together. Although DAO is a powerful programming interface, it provides no facilities for binding a control to the fields of a database table. In contrast, the Jet data control combines sparse programming features with a powerful data-bound control model. DAO and the Jet data control complement each other well. The Database property of the Jet data control exposes the underlying Database object being used. With this and other properties of the Jet data control, you can manipulate the database using DAO.

Using the Jet data control in conjunction with DAO sounds wonderful, particularly when you are accessing data from an MDB file or another supported desktop database file. You get the programmability of DAO with the high-level features of the Jet data control. However, there's a price to be paid for a high-level interface that masks the difference between a local database file and a connection to a remote SQL server. In this case, the drawback is diminished performance when connected to an ODBC data source; there's the extra Jet engine layer. Since most client/server applications written in Visual Basic end up using an ODBC connection to a remote database server, this sluggish performance of ODBC data sources via the Jet engine is a big concern. Many developers have resorted to learning and using the ODBC API directly from Visual Basic to get the increased performance. Using this approach, developers bypass the Jet engine and data control entirely.

Understanding ODBC Architecture

The ODBC API is a set of functions, designed primarily for use from C/C++, that defines an interface for querying database servers using SQL. While most SQL servers provide an API with which you can connect to the server and pass SQL statements for processing, using that API restricts you to that vendor's database server (incidentally, the database vendors don't seem to mind). The beauty of ODBC is that rather than having to learn a database vendor's proprietary API, you can use the ODBC API to access any data source for which an ODBC driver is available. This means that if, on Monday morning, your boss decides to switch the entire department's data repository from Oracle to Microsoft SQL Server, applications using the ODBC API will require virtually no modification. Another major design goal of ODBC is to provide performance equivalent to that of the DBMS's native API. Nevertheless, to use ODBC you must know SQL and the ODBC API. The ODBC architecture can be broken into four components: applications, the driver manager, drivers, and data sources.

In this model, the application is your Visual Basic program. Based on the user's actions, it makes requests of the driver manager via the ODBC API. The driver manager (ODBC32.DLL) is the middleman between your application and the actual ODBC driver. The purpose of the ODBC driver manager is threefold: it loads and unloads ODBC drivers, routes ODBC API calls to the appropriate driver, and enables the end user to create a Data Source Name (DSN).

We'll explain DSNs first since they're the primary connection vehicles in ODBC. A DSN is actually an alias for an ODBC data source. The idea is that users should not have to know the database name and location in order to connect to an ODBC data source. Instead, all this information is collected and stored in one place (the Windows registry) and then simply referred to later by a user-given name: the DSN. The user or administrator can create DSNs via the ODBC icon in the Windows Control Panel. It is also possible to create DSNs programmatically by using the ODBC ConfigDSN function.
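As a rough sketch of what the driver manager stores, a DSN's registry entries might look like the following (shown in equivalent ODBC.INI-style form; every name, path, and key here is hypothetical, and the exact keys vary by driver):

```ini
; Hypothetical DSN "MyDSN" -- illustrative only
[ODBC Data Sources]
MyDSN=SQL Server

[MyDSN]
Driver=C:\WINDOWS\SYSTEM\SQLSRV32.DLL
Server=MYSERVER
Database=MyDB
Description=Sample data source
```

When an application later passes "MyDSN" to SQLConnect, the driver manager reads these entries to find the driver, server, and database.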

Without the driver manager, your application would have to load and unload the appropriate drivers, collect the connection information, and so on. The driver manager can also load several ODBC drivers at once: several applications on your machine could simultaneously access different data sources requiring different drivers, or a single application could query data from two servers requiring different drivers. This is one of the major reasons for using the driver manager: it can manage multiple, concurrent connections to multiple ODBC drivers.

An ODBC driver is responsible for:

  • Implementing the ODBC API functions
  • Establishing a connection and submitting requests to the data source
  • Returning results to the application
  • Translating errors into standard error codes
  • Declaring and manipulating cursors
  • Managing transactions

The goal is to insulate an application from the database vendors' proprietary API. As mentioned earlier, the application instead communicates through the ODBC API, the driver manager interprets these calls and loads the necessary ODBC driver, and the driver interacts with the actual data source. Your application can access any data source with a single API.

While ODBC was designed to fit into a client/server environment, it is also flexible enough to handle non-client/server DBMSs. For example, ODBC drivers are available for interacting with MDB files, other desktop database files, text files, and even Microsoft Excel (XLS) files. To deal with these two fundamentally different data sources, ODBC defines different kinds of drivers: one-tier and two-tier.

A one-tier ODBC driver accesses a desktop database file like an MDB or flat text file. The usual configuration of such a system has the database located on the same machine, or tier, as the driver. One-tier drivers usually have relatively large memory footprints since they are a driver and DBMS rolled into one. For example, the MDB ODBC driver contains a special version of the Jet engine. While one-tier drivers aren't all that useful from Visual Basic (since the Jet engine already provides access to MDB files), they are crucial if, say, Microsoft Office applications want to access data in a local database.

More common is the two-tier driver, which fits into the classic client/server mold. The driver (client) sends SQL requests to and receives results from the DBMS (server), usually across a network.

The ODBC standard defines three levels of driver conformance: core, level 1, and level 2. The core functions correspond to the functions in the X/Open and SQL Access Group (SAG) specification on which ODBC is based. Levels 1 and 2 are two sets of extended functionality defined by Microsoft. These provide features such as scrollable cursors and asynchronous processing. Each ODBC API function is categorized into one of these three levels. Figure 5 lists the core ODBC functions and their use.

Using the ODBC API

Now let's look at how a Visual Basic application would actually use the ODBC API. This is not as trivial as it might sound, because the ODBC API was designed primarily for applications written in C or C++. Due to differences in the way the languages pass arguments to procedures and store data types (such as strings) in memory, calling the ODBC API from Visual Basic is difficult. In C, for example, procedure arguments are passed by value; in Visual Basic they are passed by reference. Although this can be handled by placing the ByVal keyword in front of parameters when calling a procedure, it can be tricky to determine when this is necessary. As with any other DLL procedures you intend to call from Visual Basic, you must first declare them. Visual Basic comes with a file, ODBC32.TXT, that contains all the declarations for the ODBC API. Either add it to your project, copy and paste the parts you need into your application, or use the API Text Viewer program.

Like most of the Windows® API, the ODBC API is handle-based; handles are used to reference the objects you work with. ODBC applications work with three handle types: environment, connection, and statement. Every application that uses ODBC begins with the allocation of one environment handle (with SQLAllocEnv) and ends by freeing that environment handle (SQLFreeEnv). The environment handle is the parent handle with which all other ODBC resources allocated for the application are associated.

Dim rc As Integer      ' Return code for ODBC functions
Dim henv As Long       ' Environment handle
rc = SQLAllocEnv(henv) ' Allocate the environment handle

Next, the connection handle is allocated. The connection handle manages all information about a connection to an ODBC data source. Notice that the SQLAllocConnect function takes the handle to the environment as the first argument so that it can associate the newly created connection handle with the environment handle.

Dim hdbc As Long                       ' Connection handle
rc = SQLAllocConnect(ByVal henv, hdbc) ' Allocate a
                                       ' connection handle

It is important to realize that no ODBC driver has been loaded up to this point. We have been interacting with the driver manager only. Once allocated, the connection handle can be used to make a connection to a target data source by using the SQLConnect function, which takes the data source name (DSN), the user's ID, and password as arguments. The driver manager then looks up the information associated with the DSN in the registry, retrieving information that includes the name of the required ODBC driver, the name of the server, and the name of the database.

In the following code, we ask ODBC to look up the DSN and to log in. At this stage, the driver manager loads the driver referenced by the DSN. It then calls the SQLAllocEnv and SQLAllocConnect functions in the driver automatically (remember that the driver wasn't loaded when we called these functions in the driver manager). Finally, the driver manager calls the SQLConnect function in the loaded ODBC driver.

Dim DSN As String, UID As String, PWD As String
DSN = "MyDSN"          ' The data source name
UID = "MyUserID"       ' The user's identification
PWD = "MyPassword"     ' The user's password
rc = SQLConnect(hdbc, DSN, Len(DSN), UID, Len(UID), PWD, Len(PWD))

The next step depends on the ODBC driver. If the loaded ODBC driver is one-tier, not much happens at this stage, since the driver only has to open a database file and there's no network connection to make. However, if we're referencing a remote database server, then processing the SQLConnect call involves using the network interface installed on the client machine to make a connection to the physical server specified by the DSN information stored in the registry. The remote server then validates the user ID and password. If all goes well, the application now has a valid connection to the server. The next step is to obtain a statement handle that enables the program to query the remote server. The statement handle, created by the SQLAllocStmt function, is the real workhorse of any ODBC application.

Dim hstmt As Long              ' Statement handle
rc = SQLAllocStmt(hdbc, hstmt) ' Allocate a statement handle

Finally, we want to pass an SQL query to the ODBC data source. SQLExecDirect takes the statement handle and SQL string as arguments. The SQL server then receives and executes the requested SQL statement, in this case a command to select the LastName field from the Employees table:

Dim SQLIn As String
SQLIn = "Select LastName from Employees"
' Execute the SQL string
rc = SQLExecDirect(hstmt, SQLIn, Len(SQLIn))

When the SQLExecDirect function returns, the results of the query can be received. The SQLFetch function begins this process by returning a row of data from the result set. Then the SQLGetData function actually gets the data from the row returned by SQLFetch and stores it in a string variable called field_value. This continues in a loop until no more rows are available in the result set and the return code of SQLFetch no longer equals SQL_SUCCESS.

Dim field_value As String * 255
Dim LenRead As Long
Do While SQLFetch(hstmt) = SQL_SUCCESS
    field_value = String(255, 0)
    ' Get column 1 of the fetched row as character data
    rc = SQLGetData(hstmt, 1, SQL_C_CHAR, field_value, 255, LenRead)
    MsgBox field_value
Loop

An ODBC API application ends by closing the connection to the ODBC data source and freeing the handles:

rc = SQLDisconnect(ByVal hdbc)  ' Closes the connection
rc = SQLFreeConnect(ByVal hdbc) ' Frees the connection handle
rc = SQLFreeEnv(ByVal henv)     ' Frees the environment handle

Using SQLBindCol with 32-bit Visual Basic 4.0

If you are an experienced ODBC API programmer, you are probably familiar with the intricacies of using advanced ODBC functions such as SQLBindCol and SQLExtendedFetch. If so, Visual Basic has an interesting gotcha that can cause you trouble. In Visual Basic 3.0 and the 16-bit edition of Visual Basic 4.0, everything works as expected:

Dim ReturnString As String * 100
rc = SQLBindCol(hStmt, 1, SQL_C_CHAR, ReturnString, 100, cbRetStr)
rc = SQLExtendedFetch(hStmt, SQL_FETCH_NEXT, 1, cbrow, RowStatus)

SQLBindCol binds data from the first column in the result set to the ReturnString variable. The next call, SQLExtendedFetch, fetches one row from the result set and stores it in the variable referenced in the SQLBindCol call (ReturnString). Here's the catch: when using SQLBindCol from 32-bit Visual Basic applications, you must take special care when handling string data types. Due to the manner in which Visual Basic converts strings passed from DLLs into Unicode, you cannot directly bind result-set columns to string or variant variables. Instead, you must bind string-type result-set columns to byte arrays which are not subject to Unicode conversions:

Dim ReturnBuffer(100) As Byte
rc = SQLBindCol(hStmt, 1, SQL_C_CHAR, ReturnBuffer(0), 100, cbRetStr)
rc = SQLExtendedFetch(ByVal hStmt, SQL_FETCH_NEXT, 1, cbrow, RowStatus)

While this code correctly retrieves data into the ReturnBuffer byte array, the question now becomes what to do with it. You can't display a byte array using MsgBox or assign it to the Text property of a text box, since both require a normal string. Based on an example from the Building Client/Server Applications in Visual Basic manual included with the Visual Basic 4.0 Enterprise Edition, we have written a procedure called BytesToString that performs the desired conversion:

Function BytesToString(byte_array() As Byte) As String
    Dim Data As String, StrLen As Integer
    Data = StrConv(byte_array(), vbUnicode)
    StrLen = InStr(Data, Chr(0)) - 1   ' find first terminating null
    BytesToString = Left(Data, StrLen)
End Function

This function accepts an array of bytes and converts it to a string. The only difference between this version and the one in the Visual Basic documentation is that the original converts the entire byte array into a same-length character string. In our example, we only want the portion of the byte array that contains our data (up to the null terminator). Once we have the data in a string, we can work with it in any way we like.
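The same trim-at-the-null logic can be sketched outside Visual Basic as well; here is a hypothetical Python helper (for illustration only, not part of the article's VBDB code):

```python
def bytes_to_string(buffer: bytes) -> str:
    """Return the text stored in a fixed-size, null-padded byte buffer,
    keeping only the data before the first terminating null."""
    end = buffer.find(b"\x00")
    if end == -1:            # no null found: the whole buffer is data
        end = len(buffer)
    return buffer[:end].decode("ascii")

# A 10-byte buffer as an ODBC driver might fill it: data plus null padding
print(bytes_to_string(b"Eddon\x00\x00\x00\x00\x00"))
```

As in the Visual Basic version, only the portion of the buffer up to the null terminator is kept.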

Now the only problem is how to put it back in the database. Say you read a LastName field from a table and display it in a text box. The user then changes the name and you want to update the table to reflect this. You now have the data as a string, but to update the database you will need to convert it back to a byte array. The following procedure, StringToBytes, automates this task for you.

Sub StringToBytes(Data As String, ByteLen As Integer, _
                  return_buffer() As Byte)
    Dim Count As Integer
    For Count = 0 To Len(Data) - 1
        return_buffer(Count) = Asc(Mid(Data, Count + 1, 1))
    Next Count
    ' Zero-fill the rest of the buffer
    For Count = Len(Data) To ByteLen
        return_buffer(Count) = 0
    Next Count
End Sub

You can now call SQLSetPos with the SQL_UPDATE flag to save the data back to the database.


While the ODBC API is a powerful and efficient method of accessing database servers, using it from Visual Basic is rather involved. The syntax is convoluted by the awkward ByVal argument-passing requirements and data type conversions. In addition, the ODBC API provides no facility for data-bound controls, which means that you'll have to write all the code to deal with the user interface. Nevertheless, programmers still use ODBC because its performance advantage over the Jet engine is so great. For more information see Kyle Geiger's Inside ODBC (Microsoft Press, 1995).

Remote Data Objects

During the two years between the releases of Visual Basic 3.0 and 4.0, Microsoft apparently realized two things: the Jet engine, while feature-rich and powerful, abstracts database programming to the point of masking the differences between desktop database files and remote SQL servers, and this masking imposes too big a performance penalty; meanwhile, the ODBC API is simply too cumbersome and difficult for most Visual Basic developers. So Microsoft developed Remote Data Objects (RDO). Like the Data Access Objects of the Jet engine, RDO is an object-oriented database programming interface. Unlike Jet, RDO is not a database engine but rather a thin layer of code implemented on top of ODBC. In other words, RDO is Visual Basic's interface to the ODBC API, and, like the ODBC API, it uses SQL statements to access and query ODBC data sources. While RDO can access any ODBC data source (desktop database files, back-end SQL servers, and so on), it has been optimized to take advantage of the sophisticated query engines in products such as Microsoft SQL Server 6.0 and Oracle 7 Server.

There are five main advantages to using RDO when compared with the ODBC API:

  • You don't need to learn how to use the ODBC API from Visual Basic
  • Performance approximates that of the ODBC API
  • RDO is a native Visual Basic programming interface
  • Remote data control and data-bound control models are available
  • The ODBC API can be used in conjunction with RDO

Figure 6 shows the close relationship between the ODBC API and RDO. The three handle types of the ODBC API (the environment, connection, and statement handles) have corresponding object types in RDO (rdoEnvironment, rdoConnection, and rdoResultset). Although it's an oversimplification, you could say that RDO is to the ODBC API what MFC is to the Windows API. Take note that RDO and ODBC database terminology differs somewhat from Jet's; in RDO, a field is called a column and a record is called a row.

Figure 6 Relationship Between ODBC and RDO

Perhaps we should digress for a moment and discuss the concepts of containers and collections, which are new in Visual Basic 4.0. rdoEngine sits at the top of the object hierarchy chart in Figure 6. It is a container object and contains the rdoEnvironment and rdoError objects. A container's objects are referenced with the usual dot notation:

rdoEngine.rdoEnvironments
The rdoEngine object actually contains a collection of rdoEnvironment objects, called rdoEnvironments. A collection is a named group of related components; basically, it is an array of objects of the same type. A member of a collection is referenced with the normal array notation:

rdoEngine.rdoEnvironments(n)

where n is the index into the collection.

The rdoEngine object is the top-level container for all other objects in the RDO hierarchy. There is no need to create the rdoEngine object, as it is predefined and only one can exist in an application. The rdoEngine object can be used to create environment objects (with the rdoCreateEnvironment method) or to register an ODBC data source (using the rdoRegisterDataSource method). One initial environment is created by the rdoEngine object automatically; it can be referenced as the rdoEnvironments(0) object contained by rdoEngine.

Dim myEnv As rdoEnvironment
Set myEnv = rdoEngine.rdoEnvironments(0)

Additional rdoEnvironment objects can be created with the rdoCreateEnvironment method of the rdoEngine object. These new rdoEnvironment objects then become part of the rdoEnvironments collection.

Dim myEnv As rdoEnvironment
Set myEnv = rdoEngine.rdoCreateEnvironment("MyEnvName", _
    "UserName", "Password")

Environments can also determine transaction scope. You use the BeginTrans, CommitTrans, and RollbackTrans methods of an rdoEnvironment object to control transactions within that environment (all open rdoConnection databases in the environment are affected). Once you have a valid rdoEnvironment object, the next step is to open a connection with an ODBC data source by using the OpenConnection method; this establishes a physical link to the data source. The OpenConnection method accepts the parameters shown in Figure 7.
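The BeginTrans/CommitTrans/RollbackTrans bracketing described above is the same transaction model found in most SQL interfaces. As a rough, language-neutral illustration (using Python's built-in sqlite3 module and an in-memory database, not RDO), a rolled-back change disappears while a committed one persists:

```python
import sqlite3

# Autocommit mode so we can issue BEGIN/COMMIT/ROLLBACK ourselves
con = sqlite3.connect(":memory:", isolation_level=None)
con.execute("CREATE TABLE Employees (LastName TEXT)")

# Rough analogue of BeginTrans ... RollbackTrans: the insert is undone
con.execute("BEGIN")
con.execute("INSERT INTO Employees VALUES ('Flinstone')")
con.execute("ROLLBACK")

# Rough analogue of BeginTrans ... CommitTrans: the insert is kept
con.execute("BEGIN")
con.execute("INSERT INTO Employees VALUES ('Eddon')")
con.execute("COMMIT")

rows = [r[0] for r in con.execute("SELECT LastName FROM Employees")]
print(rows)  # ['Eddon']
```

In RDO the same bracketing applies to every open rdoConnection in the rdoEnvironment, rather than to a single connection as here.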

The following code ignores the dsName parameter. Instead, it supplies all the connection information as part of the connect string, and connects to an ODBC data source with an existing data source name of MyDSN:

Dim myCon As rdoConnection
Dim Connect As String
Connect = "UID=MyUserID;PWD=MyPassword;" & _
          "DSN=MyDSN;DATABASE=MyDB;"
Set myCon = myEnv.OpenConnection("", rdDriverPrompt, _
    False, Connect)

To query and retrieve rows from the data source, you need an rdoResultset object created by calling the OpenResultset method. Two RDO objects support this method: rdoConnection and rdoPreparedStatement. Using the rdoConnection object is a good idea if you want a one-time, ad-hoc query that you do not intend to execute again (for example, a query entered by the user). Another option is to create an rdoPreparedStatement object using the CreatePreparedStatement method of an rdoConnection object. This method is recommended if you are creating a query that you intend to run multiple times. The rdoPreparedStatement object even lets you create queries with parameters that are substituted in the query before being run (more on this shortly).

Dim myRes As rdoResultset
Dim myPs As rdoPreparedStatement
Set myPs = myCon.CreatePreparedStatement("MyFirstPs", _
    "Select LastName from Employees")
Set myRes = myPs.OpenResultset()

Defining RDO Cursors

A cursor is a logical set of records managed by the data source or by the ODBC driver manager on the client machine. The type of rdoResultset cursor created by the previous code is the default, forward-only type. The other types available (depending on the capabilities of the ODBC driver) are static, keyset, and dynamic. Forward-only result sets retrieve data quickly and with the least overhead. This type of result set exposes only a single record at a time, and you can only move forward in the result set. Another drawback is that rows in the result set are not refreshed as they change on the data source. Nevertheless, it may be more efficient to rebuild a forward-only result set than to use one of the other types. A static cursor (similar to the Jet Snapshot-type Recordset object) differs mainly in that you can move both forward and backward in the result set. A keyset-type rdoResultset (similar to a Jet Dynaset-type Recordset object) is scrollable, and its rows reflect changes made after the result set was created. A dynamic-type rdoResultset is identical to the keyset type except that membership is not frozen; RDO constantly checks whether new or updated records on the data source meet the restrictions of the query. Due to this checking, this type of cursor carries the largest overhead. Figure 8 summarizes the four types of rdoResultset cursors.
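Forward-only semantics appear in many database APIs. As a rough illustration outside RDO (Python's built-in sqlite3 module, whose cursors are forward-only by design), each fetch exposes one row and there is no way to scroll back:

```python
import sqlite3

# A forward-only result set: rows are exposed one at a time,
# and the cursor can only advance, never move backward.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Employees (LastName TEXT)")
con.executemany("INSERT INTO Employees VALUES (?)",
                [("Eddon",), ("Flinstone",)])

cur = con.execute("SELECT LastName FROM Employees ORDER BY LastName")
first = cur.fetchone()   # ('Eddon',)
second = cur.fetchone()  # ('Flinstone',)
third = cur.fetchone()   # None: end of the result set
print(first, second, third)
```

A scrollable cursor (static, keyset, or dynamic in RDO terms) would additionally let you revisit rows already fetched.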

Some data sources support server-side cursors (for example, Microsoft SQL Server 6.0). If so, the remote database engine builds cursor keysets on the server. Server-side cursors reduce the memory and disk space required on client machines, but shift that load to the server. You can set the rdoEnvironment object's CursorDriver property to control whether cursors are created and maintained on the client or server machine (see Figure 9).

Traversing and Editing Rows with RDO

At any time, only one row in a cursor is exposed for data retrieval or modification: the row referenced by the current row pointer. You can change the current row by using methods such as Move, MoveNext, MovePrevious, MoveFirst, or MoveLast, or by setting the AbsolutePosition or PercentPosition properties of an rdoResultset object. You can also modify data in the current row using the Edit method, assuming the Updatable property of the rdoResultset is set to True. Then set the column values as desired and use the Update method to save the row back to the data source. If you change your mind, you can use the CancelUpdate method instead of the Update method to abandon the edits. You can also add and delete rows with the AddNew and Delete methods.

Do Until myResultset.EOF
    If myResultset("LastName").Value = "Eddon" Then
        ' ...edit the current row here...
    End If
    myResultset.MoveNext
Loop
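As a sketch, assuming an updatable rdoResultset named myResultset positioned on a row of interest, editing the current row and appending a new one might look like this:

```vb
' Change a column of the current row and save it
myResultset.Edit
myResultset("LastName").Value = "Eddon"
myResultset.Update

' Append a new row to the result set
myResultset.AddNew
myResultset("LastName").Value = "Smith"
myResultset.Update
```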

Instead of using the Edit, AddNew, Delete, and Update methods, you can use the Execute method to carry out an SQL query that contains one or more UPDATE, INSERT, or DELETE statements. The Execute method is designed to execute action queries that do not return rows. You can also control how an rdoResultset object created by the OpenResultset method manages concurrency by setting the LockType argument to one of the flags shown in Figure 10.
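For instance, a short sketch of both options (the table and column names are hypothetical, and myCon and myPs are assumed to exist as in the earlier examples):

```vb
' Run an action query directly; no rdoResultset is created
myCon.Execute "Update Employees Set Salary = 40000 " & _
    "Where LastName = 'Eddon'"

' Or request pessimistic locking when building a result set
Dim myRes As rdoResultset
Set myRes = myPs.OpenResultset(rdOpenKeyset, rdConcurLock)
```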

Processing Multiple Result Sets

Some queries you execute will return multiple result sets. The following example is made up of three parts: two SELECT statements, which return columns from tables, and one UPDATE statement, which will not return any rows.

Dim myRes As rdoResultset
Dim myPs As rdoPreparedStatement
Set myPs = myCon.CreatePreparedStatement("MySecondPs", _
    "Select LastName from Customers; " & _
    "Select City from Distributors; " & _
    "Update Employees Set Salary = 50000 Where Name = 'Flinstone'")
myPs.RowsetSize = 1
Set myRes = myPs.OpenResultset(rdOpenForwardOnly)

The previous code creates the query and then executes it by calling the OpenResultset method. This opens the first result set, corresponding to the first Select statement, which returns the LastName column of all rows in the Customers table. The following code executes a loop that displays the first column (column 0) of the result set in a message box.

Do Until myRes.EOF
    MsgBox myRes(0)
    myRes.MoveNext
Loop

When you are finished working with data from the first result set, you can activate the following one by executing the MoreResults method (after which the first result set will no longer be available). The following code selects the next result set and then loops through and displays all rows where the city name begins with the letters A through D.

If myRes.MoreResults Then
    Do Until myRes.EOF
        If myRes(0) >= "E" Then Exit Do
        MsgBox myRes(0): myRes.MoveNext
    Loop
End If

You're now ready to process the last result set of the query, which was generated from an Update statement that does not return rows. Nevertheless, the result set must be processed. The only potentially useful information it yields is available via the RowsAffected property of the rdoPreparedStatement object, which is set to the number of rows that were affected by the query.

If myRes.MoreResults Then
    MsgBox myPs.RowsAffected & " rows updated."
End If

Creating Parameter Queries

Sometimes you may want to execute a query several times and use different parameters each time. For example, you might have a query in your application that returns the last names of people making $30,000 or more ("Select LastName from Employees Where Salary >= 30000"). However, depending on user input, you might want to change the monetary criteria; this, in turn, determines a row's membership in the result set. One possibility is to create and execute a new query with differing amounts every time. This would be slow and inefficient because the SQL server would need to recompile the query every time it was executed. Much more efficient would be to create a prepared statement specifying the salary criteria as an unknown value: "Select LastName from Employees Where Salary >= ?". The question mark is the symbol for an unknown. For every ? in the SQL string, an rdoParameter object is created automatically and added to the rdoParameters collection. This enables you to assign values to the parameters by referencing the object in the collection using the rdoParameters(n) notation. For example, the first parameter in an SQL string is rdoParameters(0), the second parameter is rdoParameters(1), and so on. The missing values must be assigned before executing the query with the OpenResultset method.

Dim myRes As rdoResultset
Dim myPs As rdoPreparedStatement
Set myPs = myCon.CreatePreparedStatement("MyThirdPs", _
    "Select LastName from Employees Where Salary >= ?")
myPs.rdoParameters(0).Value = 50000
Set myRes = myPs.OpenResultset()

To execute the query again with a different parameter value, you could simply change the value of the rdoParameter object and use the Requery method:

myPs.rdoParameters(0).Value = 100000
myRes.Requery

When executing an action query that will call a stored procedure on the server, you might want to receive a return value from the procedure. In this case, you need to set the rdoParameter's Direction property. In the following hypothetical case, we will call a stored procedure, sp_add, which takes two parameters and returns their sum.

Dim myPs As rdoPreparedStatement
Set myPs = myCon.CreatePreparedStatement("MyFourthPs", _
    "{ ? = call sp_add (?, ?) }")
myPs.rdoParameters(0).Direction = rdParamReturnValue
myPs.rdoParameters(1).Value = "5"
myPs.rdoParameters(2).Value = "3"
myPs.Execute    ' Execute the stored procedure query
MsgBox "5 + 3 = " & myPs.rdoParameters(0).Value

In this case, you can see that we have created a prepared statement with three parameters. We set the first parameter's Direction property to rdParamReturnValue; this indicates a return value. The second and third parameters' Direction property wasn't set because rdParamInput is the default. Next, we used the Execute method to actually run this query, instead of the OpenResultset method used in previous examples, because this action query does not return a set of rows (result set). After execution, the first parameter's Value property contains the value returned by the sp_add stored procedure, in this case 8. It is worth noting that Microsoft SQL Server 6.0 creates temporary stored procedures for prepared statements; these temporary stored procedures are cleaned up automatically if the connection to the server is broken abnormally.

Controlling Asynchronous Operations

Normally, when you call the OpenResultset method of an rdoPreparedStatement object, your application's execution is suspended until the query is completed. It is possible to run the query asynchronously; this means that execution focus will be returned to your application immediately, even though the server has not yet fulfilled the query. To turn on asynchronous operation, pass the rdAsyncEnable flag to the OpenResultset method as shown in the following code:

Dim myRes As rdoResultset
Dim myPs As rdoPreparedStatement
Set myPs = myCon.CreatePreparedStatement("MyFifthPs", _
    "Select * from MyTable")
Set myRes = myPs.OpenResultset(, , rdAsyncEnable)

You cannot begin to use the new rdoResultset object right away, as it has not actually been created yet. Test the StillExecuting property to determine when the new object is valid for use. Call the Cancel method to abort before the query is completed.

Do While myPs.StillExecuting
    Print "Processing..."
    DoEvents    ' yield so the query can run to completion
Loop

Using Handles with RDO and the ODBC API

One of the features that makes RDO very extensible is that its objects expose the underlying ODBC handles, similar to the way Visual Basic form objects have an hWnd property that exposes the actual window handle. Figure 11 lists the ODBC handles that are available through RDO.

These handles can be used to make direct calls to the ODBC API from Visual Basic, creating a type of hybrid program that uses both RDO and the ODBC API. Be aware that incorrect use of the ODBC API on RDO handles (for example, closing a connection handle using the ODBC API and then trying to use the rdoConnection object) can cause unpredictable behavior.
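As an illustration of this hybrid approach, here is a sketch that passes the hDbc handle of an rdoConnection to the ODBC SQLGetInfo function to ask the driver for the DBMS name. The Declare statement is our own translation of the ODBC prototype, and the constant comes from the ODBC header files; verify both against your ODBC SDK before relying on them.

```vb
Declare Function SQLGetInfo Lib "odbc32.dll" _
    (ByVal hdbc As Long, ByVal fInfoType As Integer, _
    ByVal rgbInfoValue As String, ByVal cbInfoValueMax As Integer, _
    pcbInfoValue As Integer) As Integer

Const SQL_DBMS_NAME = 17    ' from the ODBC header files

Sub ShowDbmsName(myCon As rdoConnection)
    Dim buf As String
    Dim cb As Integer
    buf = String$(255, 0)   ' allocate a buffer for the driver
    ' Hand the rdoConnection's ODBC connection handle to the API
    If SQLGetInfo(myCon.hDbc, SQL_DBMS_NAME, buf, 255, cb) = 0 Then
        MsgBox "Connected to: " & Left$(buf, cb)
    End If
End Sub
```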


The Remote Data Control

While RDO is powerful and efficient, it cannot use data-bound controls. To support the Visual Basic data-bound control model, you need the Remote data control. You can bind standard data-aware controls to the Remote data control; the bound controls can't tell the difference, because the Remote data control looks and behaves like the Jet data control. But there is a big difference between them: the Jet data control is implemented using the Jet database engine, while the Remote data control is implemented through calls to the ODBC API, giving you a powerful data-bound control model without the Jet engine. In addition, just as you can combine the Jet data control with DAO code, the Remote data control can be combined with RDO code.
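As a sketch, a Remote data control (using its default name, MSRDC1, and a hypothetical DSN and query) might be configured at run time like this; controls bound to it at design time then display the result set automatically:

```vb
' Point the Remote data control at an ODBC data source
MSRDC1.DataSourceName = "MyDSN"
MSRDC1.UserName = "sa"
MSRDC1.Password = ""
MSRDC1.SQL = "Select LastName, City from Employees"
MSRDC1.Refresh      ' connect and build the rdoResultset
```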

If you are designing a new client/server database system and planning to use Visual Basic for the development of the client-side application, we certainly recommend RDO and the Remote data control as the way to go. If you have an existing application that uses the ODBC API, then the effect RDO has on your project should depend on how comfortable you feel working with the ODBC API. If you do decide to convert to RDO, you can take comfort in the fact that the underlying ODBC API handles are always available. Lastly, if your existing project uses DAO to access ODBC data sources, chances are good that you will want to migrate it to RDO. Depending on which services of the Jet engine your application uses, converting existing DAO programs to RDO is normally not very difficult. Figure 12 lists the DAO objects and their RDO counterparts. Figure 13 compares the features of the different data access programming models available in Visual Basic.

Remote Automation

Remote automation is a special Visual Basic feature that extends the existing OLE automation model so that it supports the separation of the client and server applications by a network. The technical aspects of how this is accomplished can be found in "Visual Basic 4.0 Provides Easy Entry to the Advanced Features of Windows 95," by Joshua Trupin (MSJ October 1995). The current fashion in client/server architecture is to have many clients to one server. While this is a type of distributed computing, it is very primitive. This problem especially affects database-application developers in the design of client-side programs. The server is often simply a SQL server acting as a data repository. The client, however, is where all the "business logic" (the code that actually understands what the data means and how to interact with it) exists. For example, if you have many client applications in your department, and they all query a common database server, it is possible that many of these client applications have redundant code designed to handle the complex business logic required by your company's data management.

A much more flexible model is the partitioned system where two or more parts work together to provide a service to the user. It is not even necessary that these parts run on separate computers, although that is common. One approach becoming popular for database applications is the three-tier or n-tiered architecture. In this model, you have a client and a database server, and you also introduce a middle layer between the two that acts as the "business" or intelligent application server. You can then simplify the client application by moving the business logic of your system to the application server, which might reside on the same computer as the database server (or it might not). This often has the effect of speeding development and lowering distribution costs, not to mention making the code more maintainable. The application server will then make requests of the database server on the client's behalf. This not only simplifies the development and maintenance of the clients, but also means that you have to update only one application when the company's business logic changes.

Perhaps a real-life example will make this concept easier to understand. We recently worked on a Windows-based software package called MarvelTrak (to protect the guilty, the name of the product has been changed) for a large transportation firm; the software provides package-tracking capabilities to the firm's customers. When you ship a package from Los Angeles to New York, you are given a special tracking number. If you later want to know what has happened to your package, you simply load MarvelTrak and type in the tracking number. The software dials into the company's mainframe and retrieves information about the current whereabouts of your package. For example, it might tell you that the package is currently in a shipping hub, or that it has already been received and by whom. An electronic version of the recipient's signature might even be available.

So far this sounds like a fairly standard depiction of a client (the customer's PC) and server (the company's mainframe) project. The problem has to do with the type of tracking number you receive. Different types of numbers denote the different types of services a customer purchased (like the ability to view the recipient's signature), the package's origin, and so on. To make MarvelTrak as user-friendly as possible, we did not want the customer to need to interpret the type of number he/she had received. We wanted to take the tracking number and just pass it to the mainframe, where based on certain business rules the type of number could be determined and the appropriate information provided. However, due to internal politics relating to the people managing mainframe software development, this seemingly innocuous request was deemed completely undoable. For a while, we toyed with the idea of implementing this simple logic on the client side, but decided against it for the simple reason that, as new types of services were introduced, the software would not recognize the new services and would need to be updated. Since the software is currently being used by more than 50,000 customers, this would be no small upgrade (although perhaps not quite on the scale of the current Windows 95 upgrade cycle). Therefore we ended up with the less-than-ideal method of requiring the end-users to interpret their tracking number themselves, and enter it into the appropriate dialog box in the application.

A more elegant solution in this case might be to create an application server, running Windows NT Server 3.51, at the company's data center. This application server would have all the business-logic code to recognize the various types of tracking numbers. The Windows 95 front-end would simply dial into the application server and send it the tracking number entered by the customer. The MarvelTrak application server would then interpret its type and pass it, along with the appropriate request code, to the mainframe. This would effectively eliminate the update problem, since only the application server would need to be updated if the types of services offered changed.
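To sketch what such an application server might expose (all names and classification rules here are hypothetical), a Visual Basic class module on the server could centralize the number-classification logic:

```vb
' Hypothetical class module TrackingServer, instantiated by the
' client across the network via remote automation
Public Function GetPackageStatus(TrackingNumber As String) As String
    Dim RequestCode As Integer
    ' The business rule lives on the server: classify the number
    If Left$(TrackingNumber, 1) = "S" Then
        RequestCode = 1     ' signature-service number
    Else
        RequestCode = 2     ' standard tracking number
    End If
    ' QueryMainframe is a placeholder for the mainframe request
    GetPackageStatus = QueryMainframe(RequestCode, TrackingNumber)
End Function
```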

Setting up an existing application for distributed computing involves several steps. The following example assumes you want to use remote automation to control Microsoft Word for Windows® 95 from Visual Basic. The Visual Basic application is a very simple program that creates an instance of the Word.Basic object. If you want to try these features yourself, first test running the Visual Basic application and Word on the same computer. The code for the Visual Basic application should look something like this:

 Dim Word As Object
Set Word = CreateObject("WORD.BASIC")
Word.CharColor 6
Word.FontSize 20
Word.Insert "Hello Word, this is Visual Basic calling!"
Word.FileSaveAs "C:\AMAZING.DOC"

After you have tested this code and verified that it works, execute the following steps to produce a remotely automated Word application. Run the Automation Manager utility on the server computer. After startup and initialization it will minimize itself. Leave it running, since this is the RPC module of remote automation that listens for client requests. Run the Remote Automation Configuration Manager, click the Client Access tab, and select the Word.Basic OLE class. Click Allow All Remote Creates, then close the Remote Automation Configuration Manager.

On the client computer, run the Remote Automation Configuration Manager, click the Server Connection tab, and select the Word.Basic OLE class. Set the network address to the server computer's name, and set the network protocol to a protocol that is installed on both the client and the server. From the Register menu, select Remote, then click Apply. Finally, close the Remote Automation Configuration Manager.

Now run the Visual Basic application on the client and watch as it connects to Word on the remote computer. You only need to configure this once; thereafter, the client computer always looks to the server for the Word.Basic object. In the future, these tools and connection utilities will no doubt be simplified and perhaps done away with altogether as they are built into the operating system.

The VBDB Timing Application

The VBDB timing application is a 32-bit application written using Visual Basic 4.0 Enterprise Edition (see Figure 14). The program compares the three major data access methods of Visual Basic discussed in this article: DAO, RDO, and ODBC. All three methods are used sequentially to query an ODBC data source that you specify by referencing its DSN. The application also asks you for the user ID and password necessary to connect (you can leave these blank if they are not required by your data source).

VBDB connects and queries the database via DAO to determine what tables and fields are available on that data source. Choose a single field from a table on which the tests will be run. You can decide whether you want to test reading records only, or both reading and writing records on the server. If you tell VBDB to test both reading and writing, it first reads a record, then reverses the data and writes it back. If you display the data on the server at this point, you will find that it has been reversed. We do not recommend running this test against important data; you may want to create a sample database for the test. If you run the program again, it will reverse the data back to its original form.

The tests begin by using DAO, then RDO, and then the ODBC API. When finished, the results allow you to compare the access speed of each method. Figure 15 shows the timing results received when running this application under Windows 95 on a 66MHz Pentium connected over a network to Microsoft SQL Server 6.0 running Windows NT 3.51 on a 90MHz Pentium. The tests were run on the au_lname field of the authors table of the Pubs database that comes with Microsoft SQL Server. You can see that both RDO and the ODBC API left DAO in the dust. The unexpected result is that RDO outperformed the ODBC API by a small margin when reading and writing records and a larger margin when only reading records. We believe the explanation for this discrepancy lies in the way VBDB reads records using the ODBC SQLExtendedFetch function. VBDB calls SQLExtendedFetch to read one record at a time so as to mimic the way records are read using DAO and RDO. It would be more efficient to read a batch of records with each call to SQLExtendedFetch. Most likely, RDO is doing this type of intelligent caching automatically, and thereby improving the speed of RDO. This logic is congruent with the smaller timing difference recorded when both reading and writing records. This type of read-ahead caching cannot be done when writing data back to the server (since RDO has no idea what we intend to write). Even when reading and writing data, RDO still comes out ahead due to the read caching.
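To sketch the batch-fetch alternative (the Declare statements are our own translations of the ODBC prototypes, and the constants come from the ODBC 2.0 header files), you could set the rowset size on the statement handle before fetching, so that each SQLExtendedFetch call returns a block of rows into columns previously bound with SQLBindCol. The hstmt variable is assumed to hold an open ODBC statement handle, such as an rdoResultset's hStmt property.

```vb
Declare Function SQLSetStmtOption Lib "odbc32.dll" _
    (ByVal hstmt As Long, ByVal fOption As Integer, _
    ByVal vParam As Long) As Integer
Declare Function SQLExtendedFetch Lib "odbc32.dll" _
    (ByVal hstmt As Long, ByVal fFetchType As Integer, _
    ByVal irow As Long, pcrow As Long, rgfRowStatus As Integer) As Integer

Const SQL_ROWSET_SIZE = 9   ' from the ODBC 2.0 header files
Const SQL_FETCH_NEXT = 1

Dim crow As Long
Dim rowStatus(99) As Integer
' Ask the driver to return up to 100 rows per fetch call
SQLSetStmtOption hstmt, SQL_ROWSET_SIZE, 100
' crow receives the number of rows actually fetched
SQLExtendedFetch hstmt, SQL_FETCH_NEXT, 0, crow, rowStatus(0)
```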

To test the remote automation features discussed in this article, we modified the VBDB sample to work across a network. Once VBDB was working reliably as a client/server system, where the client was the VBDB program and the server was an ODBC data source, we wanted to distribute the program. The plan was to create a three-tier architecture in which the first tier would be the VBDB front-end, the second tier would be the VBDB timing server, and the third tier would be the ODBC data source. The first step was to separate the class modules from the VBDB front-end and move them into the VBDB timing server. We also added a user interface to the timing server, for monitoring purposes only. This strategy worked well; no changes were necessary to the client, and the only change needed on the timing server was the extra monitoring form. Once we got it working, we first tested the system with all three components on the same machine, and then proceeded to try various configurations, including one with each component on a separate machine.

At this stage, there was no point in reporting timing values because we had introduced too many variables: the app was partitioned across three machines. Besides, our objective was not to measure the speed of remote automation, RPC, or the network setup. We simply wanted to prove to ourselves that it could be done and to find out how much work was involved. It was relatively painless.

From the March 1996 issue of Microsoft Systems Journal.
