
Introduction

Before I joined Blue Fish Development Group this past year, I had never worked with Documentum. So my first task was to familiarize myself with the basics of the Documentum Foundation Classes (DFC). Documentum defines DFC as a set of Java classes that make essentially all EDM server functionality available to client programs through a published set of interfaces. It may also be described as the Object-Relational Mapper (ORM) for programmatically accessing and manipulating all of the objects in a docbase. Whenever I'm learning a new application framework, I typically look for a Quick Start or Getting Started article, either in the official documentation or through a Google search. If I'm unable to find one, I dive straight into whatever documentation is provided, with the intent of creating a set of programming exercises that essentially become my own Quick Start guide going forward. Since a docbase is a persistent store, this meant I needed to familiarize myself with the basic CRUD functions: Create, Read, Update, and Delete. Furthermore, since Documentum is (to say the very least) a document management system, I decided to explore the following basic file-system operations:
- Create folder
- Create file
- Link file to folder
- Modify file
- Fetch folder contents
- Query files by attribute (name and author)
- Delete file
- Delete folder

Finally, since I was unable to locate that Quick Start guide I was searching for, I decided to capture what I learned in the article presented here.

Setting up a DFC project in Eclipse


The complete Eclipse project containing all sample code can be found HERE.

Setting up a basic DFC project in Eclipse was a straightforward task. After installing DFC 5.3 and creating a new Java project in Eclipse, perform the following steps:
1. From your Documentum install directory, add dfc.jar, dfcbase.jar, and log4j.jar to your project's Build Path.
2. Create a new folder within your project and link it to the file system folder <dctm-install>\config.
3. Add your new linked config folder to your project's Build Path.
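To quickly verify this setup before writing any tests, a trivial class such as the following can help. This is my own sketch rather than part of the downloadable project; it only checks that the DFC jars and the linked config folder are visible from your project.

    package com.bluefish.dfc.test;

    import com.documentum.fc.client.DfClient;
    import com.documentum.fc.client.IDfClient;

    // Hypothetical sanity check (not in the sample project): if dfc.jar is missing
    // from the Build Path this will not compile, and if the linked config folder is
    // missing, getLocalClient() will typically fail with a DFC configuration error.
    public class DfcSetupCheck {

        public static void main(String[] args) throws Exception {
            IDfClient client = DfClient.getLocalClient();
            System.out.println("DFC local client created: " + client);
        }
    }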

Hello World: Logging into a Docbase


After setting up my Eclipse project, I wanted to verify that I could programmatically access my test docbase. I wrote the following JUnit TestCase, which would serve as the base class for all subsequent TestCases I might write. Its main purpose is to authenticate to our test docbase and obtain an IDfSession object for our tests to use.

package com.bluefish.dfc.test;

import junit.framework.TestCase;

import com.documentum.fc.client.DfClient;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.common.DfLoginInfo;
import com.documentum.fc.common.IDfLoginInfo;

public class Dfc5BaseTest extends TestCase {

    // TODO: refactor to pull from a properties file
    private static final String DOCBASE = "YOUR DOCBASE";
    private static final String USERNAME = "YOUR USERNAME";
    private static final String PASSWORD = "YOUR PASSWORD";

    private IDfSessionManager sessionMgr = null;
    protected IDfSession session = null;

    protected void setUp() throws Exception {
        super.setUp();

        IDfClient client = DfClient.getLocalClient();
        sessionMgr = client.newSessionManager();

        // Setup login details.
        IDfLoginInfo login = new DfLoginInfo();
        login.setUser(USERNAME);
        login.setPassword(PASSWORD);
        login.setDomain(null);
        sessionMgr.setIdentity(DOCBASE, login);

        session = sessionMgr.newSession(DOCBASE);
    }

    protected void tearDown() throws Exception {
        super.tearDown();
        if (session != null) {
            sessionMgr.release(session);
        }
    }

    protected void log(String message) {
        System.out.println(message);
    }
}

And then I tested my login code with the following subclass:

package com.bluefish.dfc.test;

public class LoginTest extends Dfc5BaseTest {

    public void testLogin() throws Exception {
        // login happens in setUp(), so nothing to do here
        assertNotNull("session is null", session);
    }
}

There are a couple of important points regarding the above code samples:
- Always be sure to clean up your IDfSessions when you're finished with them.
- If available, always use the session manager to access and release sessions.

Prior to DFC 5.x, there wasn't an IDfSessionManager, and the developer was required to call IDfSession.disconnect() whenever a session was no longer needed. The IDfSessionManager, however, supports session pooling, so it is critical that any session acquired through a session manager is released through that same session manager. Otherwise, bad things can happen, and probably will. This is typical Object/Relational Mapper design, so those familiar with a similar persistence framework should find the transition rather painless.
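For code that runs outside of JUnit's setUp()/tearDown() lifecycle, the easiest way to honor this rule is a try/finally block. The following is a minimal sketch of my own, assuming a sessionMgr that has already had an identity set for DOCBASE, exactly as in Dfc5BaseTest above:

    // Acquire a pooled session and guarantee it is returned to the same manager.
    IDfSession session = sessionMgr.newSession(DOCBASE);
    try {
        // ... work with the session here ...
        System.out.println("Connected to " + session.getDocbaseName());
    } finally {
        // Release (never disconnect) a session obtained from a session manager.
        sessionMgr.release(session);
    }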

Overview of Documentum Foundation Classes (DFC)


DFC's object model for managing docbase objects is a deep and complex hierarchy, but we can get started with the basics by looking at only a small subset of these classes:

IDfClient
IDfSessionManager
IDfSession
IDfQuery
IDfTypedObject --> IDfCollection
IDfPersistentObject --> IDfSysObject --> IDfFolder, IDfDocument

*Arrows represent object inheritance levels.

We've already been introduced to IDfClient, IDfSessionManager, and IDfSession in the previous section. So what are the remaining classes used for? The DFC Javadoc describes them as follows:

IDfClient: This interface provides functionality to establish and manage sessions with a Documentum server, and provides information about the server before a session is established.

IDfCollection: This interface provides access to collection objects.

IDfDocument: This class provides the functionality for the client to interact with dm_document objects in the repository.

IDfFolder: This interface provides access to folder-related data stored in folder objects.

IDfPersistentObject: This interface extends IDfTypedObject and is the base class for all Documentum persistent objects.

IDfQuery: This interface provides functionality for running queries against a repository.

IDfSession: This interface encapsulates a session with a Documentum repository.

IDfSessionManager: Manages identities, pooled sessions, and transactions.

IDfSysObject: This class provides the functionality for the client to interact with dm_sysobject objects in the repository.

IDfTypedObject: This interface provides basic operations for all typed objects.

We'll get a better understanding once we see them in action, so let's put them to use.

CRUD: Create, Read, Update, and Delete


Finally, it's time to do what we all love: write code. Let's revisit our chosen exercises:
- Create folder
- Create file
- Link file to folder
- Modify file
- Fetch folder contents
- Query files by attribute (name and author)
- Delete file
- Delete folder

I've created a single test case class, DfcCrudTest.java, with a method for each of our exercises. For some of the exercises, there turned out to be more than one viable way of accomplishing the goal. For example, to obtain a folder's contents, you can perform a simple DQL query, or, if you have a handle on the IDfFolder object, you can call its getContents(..) method. To demonstrate this, I included both options within my fetchFolderContents() method.

Please keep in mind that these tests are written for clarity, not for optimal design.

package com.bluefish.dfc.test;

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfDocument;
import com.documentum.fc.client.IDfFolder;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.common.IDfId;

public class DfcCrudTest extends Dfc5BaseTest {

    private static String DIR_NAME = "Subdir";
    private static String DIR_PATH = "/Temp/" + DIR_NAME;
    private static String FILE_NAME = "Getting Started with DFC and DQL.txt";
    private static String FILE_PATH = DIR_PATH + "/" + FILE_NAME;
    private static String DOC_AUTHOR = "Steve McMichael";

    private IDfFolder folder;
    private IDfDocument document;

    public void testSimpleDfc() throws Exception {
        initialize();

        // tests are order dependent
        createFolder();
        createFile();
        linkFileToFolder();
        modifyFile();
        fetchFolderContents();
        queryFiles();
        deleteFile();
        deleteFolder();
    }

    private void createFolder() throws Exception {
        log("** Testing folder creation");

        folder = (IDfFolder) session.newObject("dm_folder");
        folder.setObjectName(DIR_NAME);
        folder.link("/Temp");
        folder.save();

        log("created folder " + folder.getId("r_object_id"));
        assertEquals("unexpected folder path", DIR_PATH, folder.getFolderPath(0));
    }

    private void createFile() throws Exception {
        log("** Testing file creation");

        document = (IDfDocument) session.newObject("dm_document");
        document.setObjectName(FILE_NAME);
        document.setContentType("crtext");
        document.setFile("E:/clipboard.txt"); // add content to this dm_document
        document.save();

        log("created file " + document.getId("r_object_id"));
    }

    private void linkFileToFolder() throws Exception {
        log("** Testing file linking to folder");

        document.link(DIR_PATH);
        document.save();

        log(FILE_PATH);
        assertNotNull("unexpected folder path", session.getObjectByPath(FILE_PATH));
    }

    private void modifyFile() throws Exception {
        log("** Testing file modification");

        document.checkout();
        int numAuthors = document.getAuthorsCount();
        document.setAuthors(numAuthors, DOC_AUTHOR);
        //doc.checkin(false, "Prevents promotion to CURRENT");
        document.checkin(false, null); // When a null version label is provided,
                                       // DFC automatically gives the new version
                                       // an implicit version label (1.1, 1.2, etc.)
                                       // and the symbolic label "CURRENT".
    }

    private void fetchFolderContents() throws Exception {
        log("** Testing folder contents");

        // (1) Fetch using IDfFolder object
        IDfFolder folder = session.getFolderByPath(DIR_PATH);
        assertNotNull("folder is null", folder);

        IDfCollection collection = null;
        IDfDocument doc = null;
        int count = 0;
        try {
            collection = folder.getContents("r_object_id");
            while (collection.next()) {
                count++;
                IDfId id = collection.getId("r_object_id");
                doc = (IDfDocument) session.getObject(id);
                log(id + ": " + doc.getObjectName());
            }
        } finally {
            // ALWAYS! clean up your collections
            if (collection != null) {
                collection.close();
            }
        }

        assertEquals("wrong number of files in folder", 1, count);
        assertEquals("unexpected doc name", FILE_NAME, doc.getObjectName());

        // (2) Fetch using DQL folder(..)
        String dql = "SELECT r_object_id, object_name from dm_document where folder('" + DIR_PATH + "');";

        // Or we can fetch the contents of our folder and all of its subfolders using
        //     folder('/Temp/Subdir', descend)
        // But since we haven't added any subfolders, this will return the same set of dm_documents.
        // String dql = "SELECT r_object_id, object_name from dm_document where folder('" + DIR_PATH + "', descend);";

        IDfQuery query = new DfQuery();
        query.setDQL(dql);
        collection = null;
        String docName = null;
        count = 0;
        try {
            collection = query.execute(session, IDfQuery.DF_READ_QUERY);
            while (collection.next()) {
                count++;
                String id = collection.getString("r_object_id");
                docName = collection.getString("object_name");
                log(id + ": " + docName);
            }
        } finally {
            // ALWAYS! clean up your collections
            if (collection != null) {
                collection.close();
            }
        }

        assertEquals("wrong number of files in folder", 1, count);
        assertEquals("unexpected doc name", FILE_NAME, docName);
    }

    private void queryFiles() throws Exception {
        log("** Testing file query");

        // (1) load by path
        IDfDocument doc = (IDfDocument) session.getObjectByPath(FILE_PATH);
        assertNotNull("null doc returned", doc);
        assertEquals("unexpected doc name", FILE_NAME, doc.getObjectName());

        // (2) load by query
        // NOTE: Authors is a "repeating attribute" in Documentum terminology,
        // meaning it is multi-valued. So we need to use the ANY DQL keyword here.
        doc = null;
        String dql = "SELECT r_object_id"
                + " FROM dm_document"
                + " WHERE object_name = '" + FILE_NAME + "'"
                + " AND ANY authors = '" + DOC_AUTHOR + "'";

        IDfQuery query = new DfQuery();
        query.setDQL(dql);
        IDfCollection collection = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            assertTrue("query did not return any results", collection.next());
            doc = (IDfDocument) session.getObject(collection.getId("r_object_id"));
        } finally {
            // ALWAYS! clean up your collections
            if (collection != null) {
                collection.close();
            }
        }

        assertNotNull("null doc returned", doc);
        assertEquals("unexpected doc name", FILE_NAME, doc.getObjectName());
    }

    private void deleteFile() throws Exception {
        if (document != null) {
            log("** Testing file deletion");
            document.destroyAllVersions();
        }
    }

    private void deleteFolder() throws Exception {
        if (folder != null) {
            log("** Testing folder deletion");
            folder.destroyAllVersions();
        }
    }

    private void initialize() {
        // If something bad happened during the previous run, this will
        // make sure we're back in a good state for this test run.
        try {
            session.getObjectByPath(FILE_PATH).destroy();
        } catch (Exception e) {
            // ignore
        }
        try {
            session.getObjectByPath(DIR_PATH).destroy();
        } catch (Exception e) {
            // ignore
        }
    }
}

If you have your DFC Javadoc handy, the code sample above should provide the details required to tie everything together. However, there is one requirement I'd like to highlight. Whenever you execute a DQL query in DFC, an IDfCollection object is created as a handle to the query results, similar to a ResultSet in JDBC. This collection represents an open resource that must be closed. There are a limited number of collections available, so it is imperative that collections be closed when they are no longer in use. The two best practices we've discussed regarding resource cleanup with DFC are:
- Release your IDfSession objects to their parent IDfSessionManager when you're through with them. If you obtained your session directly from an IDfClient instead of through a session manager, be sure to disconnect the session when you are finished.
- Close your IDfCollection objects when you are through with them.
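To see both rules in one place, here is a minimal sketch of my own (not part of the sample project). It assumes a sessionMgr configured with an identity as in Dfc5BaseTest, and the DQL string is only a placeholder:

    IDfSession session = sessionMgr.newSession(DOCBASE);
    try {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT r_object_id, object_name FROM dm_document WHERE folder('/Temp')");
        IDfCollection collection = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (collection.next()) {
                System.out.println(collection.getString("object_name"));
            }
        } finally {
            // Best practice 2: close the collection as soon as you are done with it.
            collection.close();
        }
    } finally {
        // Best practice 1: release the session back to its session manager.
        sessionMgr.release(session);
    }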

Conclusion
Hopefully, this is enough to get you started. There are numerous resources available that provide a deeper dive into some of the concepts presented here. To help you out, I've provided a short list of references for further reading. Enjoy!
