Cannot access EM console after installing SOA Suite 11g PS2

After a fresh install of Oracle SOA Suite 11g PS2, the server started up without any errors in the logs. However, after startup I was unable to access the Enterprise Manager console at http://localhost:7001/em; the URL returned a 404. Interestingly, I was able to access the WebLogic administration console without any issue.

I have installed SOA Suite 11g on my local machine before and never had an issue like this. So what could be the problem?

The answer is pretty easy and can also be found here: http://forums.oracle.com/forums/thread.jspa?threadID=984020. I just thought I would document it on my blog as well.

For EM to be present in the domain, you have to create your domain with the Enterprise Manager template selected. I had forgotten that in my installation. To check whether you have EM available:

  1. Log in to your WLS Admin Console
  2. Navigate to Deployments.
  3. Check for an application named ’em’. If it is not there, then this is the issue!

If you don’t have EM available, then you can easily extend your domain by following these steps:

  1. Stop your servers in the domain
  2. Invoke the configuration wizard ($ORACLE_HOME/common/bin/config.cmd or config.sh)
  3. Choose the Extend Domain option and select your domain
  4. Select the Enterprise Manager template


  5. Complete the wizard
  6. Start your server

Now you will be able to access EM using a URL like http://localhost:7001/em.


Oracle Service Bus 11g and DB Adapter – Part II: Using an Inbound Database Adapter

Update 15.8.2010: Just uploaded the video for this blog article.

In Part 1 of this blog article series I showed how to use the Database Adapter with Oracle Service Bus 11g in an outbound scenario. I also showed a way to keep the JDeveloper project required to define the Database Adapter wrapped inside the Eclipse OSB project. That comes in handy when extending the use case, as presented now in Part 2.

Extended Use Case

In this article I will extend the use case from Part 1 with an Inbound Database Adapter that polls the database for changes. The extended scenario is shown in the image below, using the notation from the Integration Blueprint book. The elements shown in blue are the new ones added to the use case from Part 1.

 000_use-case

The Database Adapter will be configured to listen on the PERSON_CHG_T helper table for new records. This table is filled by a trigger on the PERSON_T table and will hold one row for every change to PERSON_T.
For each new row in PERSON_CHG_T I want a new OSB service to be called. This new service will use the data from the inbound request and enrich it by reusing the PersonService proxy service we built in Part 1.
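
For illustration only, here is a minimal sketch of what such a helper table and trigger could look like. The column names, the trigger events and the primary key of PERSON_T are assumptions; the real definitions ship with the SOA_SAMPLE schema from Part 1.

    -- hypothetical sketch, not the actual cr_obj.sql definitions
    CREATE TABLE person_chg_t (
      id        NUMBER NOT NULL,                 -- id of the changed PERSON_T row (assumed PK column name)
      change_ts TIMESTAMP DEFAULT SYSTIMESTAMP   -- time of the change
    );

    CREATE OR REPLACE TRIGGER person_t_chg_trg
      AFTER INSERT OR UPDATE ON person_t
      FOR EACH ROW
    BEGIN
      -- one row per change; this is the table the Database Adapter will poll
      INSERT INTO person_chg_t (id, change_ts) VALUES (:NEW.id, SYSTIMESTAMP);
    END;
    /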

Prerequisites

The prerequisites for the 2nd part are obviously the same as in Part 1. The following software needs to be installed and available:

  • JDeveloper 11g with SOA extension
  • Eclipse 3.5.2 with Oracle Enterprise Pack for Eclipse (OEPE) 11.1.1.5.0
  • Oracle Service Bus 11.1.1.3
  • Oracle Database (XE is good enough)

Additionally you need the completed OSB project from Part 1. The solution can be downloaded from here.

Project Setup

The project setup was done in Part 1. We will reuse the same Eclipse OSB project with the nested JDeveloper SOA project and just continue where we left off in Part 1.

Create the Inbound Database Adapter

First let’s create a new Database Adapter.

For that we don’t need a new JDeveloper project; we can reuse the one we created in Part 1, wrapped inside the adapter folder. I think it’s good practice to keep all the adapters necessary for one OSB project in a single JDeveloper project.

Let’s go to JDeveloper and open the composite.xml to show the SCA composite view.

  1. Drag a new Database Adapter into the SCA composite. Because it’s an Inbound Adapter, we will use the left-hand swimlane named “Exposed Services” for that. This is not strictly necessary when using the OSB, but I think it’s a good mnemonic trick (organizing inbound adapters on the left and outbound adapters on the right, as discussed in Part 1).
    010_drag-db-adapter
  2. Give the adapter service a good and meaningful name:
    015_adapter-wizard-2of4
  3. For the connection we reuse the settings already there from Part 1, so we can move forward to the Operation Type selection. This time we want to use the Database Adapter to “Poll for New or Changed Records in a Table”.
    020_adapter-wizard-4of5
  4. We want to poll the PERSON_CHG_T table, so let’s select it. 025_adapter-wizard-5of12
  5. We can see that the table only holds an ID and a timestamp, so that’s all we get in the inbound message whenever a row is inserted into PERSON_CHG_T. This is why we will later enrich the message with more information in a second step.
     030_adapter-wizard-7of12
  6. Next we need to define the strategy to use for signaling that a row has been read and successfully processed by the adapter. Because PERSON_CHG_T is a helper table no one else is using, it’s fine to just delete the row.
     035_adapter-wizard-8of12
  7. Next the Polling Options can be specified. Among others you can specify the polling frequency, which is set to 5 seconds by default, meaning that the adapter will run the SQL operation shown on the right every 5 seconds. For our sample that’s fine, but in the real world you should of course set it to a value matching your requirements (a conceptual sketch of the resulting polling cycle follows this list).
    045_adapter-wizard-9of12
  8. Last but not least the Database Adapter allows a selection criterion to be specified. We don’t use it this time, as we want to read all the rows added to the PERSON_CHG_T table. 050_adapter-wizard-10of12
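
To illustrate what the adapter will do at runtime with these settings (delete after read, 5-second polling frequency), the polling cycle conceptually boils down to something like the SQL below. This is only a sketch; the actual statements are generated by the adapter/TopLink and are shown in the wizard.

    -- every 5 seconds: read the pending change records ...
    SELECT id, change_ts FROM person_chg_t;

    -- ... hand each row to the linked service, then remove it,
    -- as per the delete-after-read strategy chosen in step 6 (column names assumed)
    DELETE FROM person_chg_t WHERE id = :processed_id;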

This finishes the creation of the Inbound Database Adapter and our work in JDeveloper. We can see the adapter in the left-hand (Exposed Services) swimlane.

 055_composite-with-new-adapter

The adapter is now prepared to poll the PERSON_CHG_T table for new records every 5 seconds. Each row being read will be sent to the service linked to the adapter. So let’s switch to the OSB project in Eclipse and create a new service to handle these messages.

Creating the OSB Service and linking it to the Inbound Database Adapter

When working with Inbound Adapters, an OSB proxy service needs to be used. The adapter will invoke the proxy service whenever a new message “is created” by the adapter.

  1. In order to be able to create/generate the proxy service, we need the new adapter artifacts in Eclipse. Just do a refresh on the adapter folder and they will show up. 060_osb-project-refresh
  2. Now we can choose Generate Service on the JCA configuration file (PollingPersonService_db.jca) to create the necessary OSB service.
    065_generate-proxy-service-for-jca
  3. Based on the JCA settings, OSB knows that it is an Inbound Adapter and will generate a JCA Proxy Service automatically. All we need to specify is the right folder: 070_name-proxy-service
  4. The proxy service settings, created for you, show that a generated WSDL is used as well:
    075_proxy-general-tab
  5. The transport settings show the usage of the JCA transport:
    080_proxy-transport-tab
  6. All we need to do is specify what should happen with the message, by defining a meaningful Message Flow. For a start, add a Pipeline Pair Node with a nested Stage Node and a Log action to show the message in the OSB log on the console. Make sure to specify a Severity level in the Log action that actually appears in the log. If you are unsure what to choose, “Error” will be fine for this sample and is shown by default.
    085_adding-log-to-proxy
  7. Now let’s deploy the OSB project and test if the Inbound Adapter works. For that, open SQL*Plus, connect to SOA_SAMPLE and do an UPDATE on the PERSON_T table (example statements are sketched after this list). This will fire the trigger on that table, which signals the change by adding a row to the PERSON_CHG_T table. Make sure to commit the change! 090_testing-with-sql-plus-chg
  8. After a maximum of 5 seconds (remember the polling frequency specified in the Database Adapter wizard) the log should show up on the OSB console window. 095_testing-with-sql-plus-chg-2
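
The test from step 7 could, for example, look as follows in SQL*Plus. The column name and the person id in the UPDATE are just assumptions for illustration; any committed change to a PERSON_T row will do.

    -- connected as the SOA_SAMPLE schema owner; a no-op update is enough to fire the trigger
    UPDATE person_t SET last_name = last_name WHERE id = 1;
    COMMIT;   -- without the commit, the row inserted by the trigger is not visible to the polling adapter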

 

We can see that the polling Database Adapter worked. A message has been sent to the OSB proxy service holding the ID of the changed PERSON_T row and a timestamp!

In a real-world scenario you would now want to do something more meaningful with this information than just logging it to the console, for example informing another system about the change. In order to do that, you might need to send more information than just the ID of the person. The system to be informed may require the full person information, similar to the information returned by the PersonService we developed in Part 1. So let’s reuse that proxy service to enrich our message, implementing the Content Enricher design pattern.

Adding the Content Enricher

To enrich our message, we want to call the PersonService proxy service from the Message Flow of the PollingPersonServiceDB proxy service.

  1. Let’s add a Service Callout action and rename the stage to EnrichmentStage. It’s always a good idea to give meaningful names to the different nodes used to structure the message flow. This helps you understand and document your message flow at development time, and in case of errors at runtime it makes it easier to identify the place where the error occurred.
     100_adding-service-callout
  2. Configure the Service Callout action to call the PersonService proxy and to invoke the findPerson operation. For the request and response messages we define two variables and specify that a SOAP Body is used. The Service Callout action allows separate variables to be used for the request and response messages; this way the content of the $body variable from the request to the proxy service stays untouched during the service callout. This is important if you want to merge the response from the service callout with the original request. That is not necessary in our simple example, where we will use the response from the service callout directly, but usually you will need to merge the two when implementing the Content Enricher pattern in OSB.
     105_configure-service-callout
  3. Next we implement the Assign action to set the requestBody variable. 
     112_add-assign-for-request-2
  4. Because we specified “Configure Soap Body” in the Service Callout properties, we need to set up the <findPersonRequest> message wrapped in a <soap-env:Body> element. The value of the <personId> element can be retrieved from the $body variable by dragging it into the Expression view and defining the XPath expression shown in the image below.
    115_configure-assign-for-request

     
    117_configure-assign-for-request-2

  5. Last but not least you need to add v1 as a custom namespace:
    120_add-namespace
  6. In the Response Action of the Service Callout we will also use an Assign action, this time to copy the value of the $responseBody variable to the $body variable.
     130_configure-assign-for-response
  7. Let’s change the Annotation of the Log action from before to state that we now log the content of the $body variable after the service callout has been made.
    135_change-log-action
  8. Let’s test it in the same way as before. Just re-execute the UPDATE on PERSON_T and this time a longer log message with a complete Person instance should be shown.
    140_testing-with-sql-plus-2

 

The Content Enrichment worked, and the complete, up-to-date person information could now be sent to any interested system.

Conclusion

This finishes the 2nd part of this blog article series.

We have added an Inbound Adapter to the use case to get informed whenever the information in the PERSON_T table changes. By reusing the PersonService from a Service Callout in the Message Flow we were able to enrich the incoming message into a more meaningful “change message”, which could now be used to inform external systems of changes happening on the PERSON_T table.

We have used the OSB to implement parts of a typical integration scenario, similar to one of the scenarios documented in our Integration Blueprint book!

The implementation of a dynamic publish-subscribe mechanism on the OSB, in order to inform the interested systems, could be the topic of a future blog article.

The source code for the solution can be downloaded from here. I will again provide a video showing how this extension of the use case has been developed.

Oracle Service Bus 11g and DB Adapter: a more integrated approach!

Update 9.8.2010: Just uploaded a video showing how the use case described in this blog has been developed.
Update 15.8.2010: Part II: Using an Inbound Database Adapter has been published today.

The JCA adapter framework known from SOA Suite has been supported by the Oracle Service Bus (OSB) since 10.3.1. The Database Adapter fills a gap in the Oracle Service Bus: there is no OSB transport for accessing a database, and accessing the database was previously only possible from an XPath function in read-only mode.

Many blog articles have already been published about using the JCA adapters with Oracle Service Bus. There are two good blog articles from Edwin Biemond and from James Taylor about how to use the Database Adapter with Oracle 10g and 11g. Additionally the Oracle Service Bus Samples page holds a viewlet that demonstrates the usage of the DB adapter with 10.3.1. So why another blog article?

First, the Database Adapter is a feature that deserves many blog articles. Second, when I went through the samples mentioned above, I found a way to better integrate the definition of the JCA adapter with the OSB proxy and business service development, which makes the handling much easier.

One of the difficulties when using the JCA adapter framework with the OSB is that two different IDEs are necessary. The adapter wizards are only available in JDeveloper, so JDeveloper has to be used to define the adapters. After that, only the artifacts generated by the adapter wizard (WSDL, XSD, JCA config, TopLink mappings, ..) are needed.

The approaches described by the sources mentioned above show how to create a JDeveloper project first, create the adapters and then copy the necessary files into the OSB projects. What I don’t like about that is the copying of the resources. Of course this can be automated, but when you have to go back and forth between the adapters and the OSB project during development, because you need to change the adapter settings multiple times, it’s just a matter of time until you end up working with an outdated version of some files. So how can we avoid that?

Of course we cannot change the fact that we have to work with Eclipse and JDeveloper in parallel, until Oracle has moved the whole OSB development environment to JDeveloper, probably with 11gR2.

The approach I present here is actually quite simple. Instead of having two separate projects, I just create the JDeveloper project embedded inside the OSB project in a special folder (adapter), as shown in the image below.

000_nesting_of_projects

This way, all the adapters for one OSB project can be placed in one single JDeveloper SOA project, and all the generated artifacts are always local to the OSB project. They can therefore be used to generate a proxy or a business service directly.

If an adapter needs to be changed, then a refresh on the adapter folder is good enough to pick up the new version of the adapter files. No more copying of files between the two projects is necessary!

Some of the older samples available on the Web show how to use the OSB console to import the artifacts generated by the adapter. This is also no longer necessary! In 11g everything can be done directly in Eclipse.

Use Case

The use case I will demonstrate is rather simple. The idea is to make the data of 3 tables in an Oracle DB accessible as a web service in a contract-first approach.

I will use the Database Adapter to access the data, wrap it with a business service and use a proxy service with two XQuery transformations to publish it as a SOAP web service with its own WSDL and XSD. This makes the data immediately available to any SOAP web service consumer. I will use SoapUI to demonstrate that.

The scenario is shown in the image below. I’m using the notation from our Integration Blueprint book.

000_use-case

Prerequisites

In order to follow the tutorial below, the following software has to be available:

  • JDeveloper 11g with SOA extension
  • Eclipse 3.5.2 with Oracle Enterprise Pack for Eclipse (OEPE) 11.1.1.5.0
  • Oracle Service Bus 11.1.1.3
  • Oracle Database (XE is good enough)

On the Oracle Database you have to install the SOA_SAMPLE schema available in the download here. Just execute the cr_obj.sql script located inside the database folder.
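
If you just want a feeling for the data model without opening the download, the three tables used in this tutorial look roughly like the sketch below. All column names are assumed for illustration only; cr_obj.sql is the authoritative definition.

    -- simplified sketch of the SOA_SAMPLE tables (column names assumed)
    CREATE TABLE country_t (
      id     NUMBER PRIMARY KEY,
      name   VARCHAR2(100),
      iso_nr NUMBER               -- the ISO country number, excluded later in the attribute-filtering step
    );

    CREATE TABLE person_t (
      id         NUMBER PRIMARY KEY,
      first_name VARCHAR2(100),
      last_name  VARCHAR2(100)
    );

    CREATE TABLE address_t (
      id         NUMBER PRIMARY KEY,
      person_id  NUMBER REFERENCES person_t,   -- 1:m from PERSON_T to ADDRESS_T
      country_id NUMBER REFERENCES country_t,  -- each address points to one COUNTRY_T row
      street     VARCHAR2(200),
      city       VARCHAR2(100)
    );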

The web service interface (WSDL and XSD) to be published by the proxy service is available in the misc folder of the download. If you follow the tutorial, it’s assumed that these two files are available in c:\temp.

Project setup

Let’s first create the Oracle Service Bus project and, inside its adapter folder, the nested JDeveloper SOA project.

  1. First create a new Oracle Service Bus Project and create a folder structure for the different artifacts created later.
    001_initial-osb-project-with-folder
  2. The adapter folder is the place where we will embed the JDeveloper project. Check and copy the name of the folder to be used when creating the JDeveloper SOA project.
    005_adapter-folder-to-create-jdev-in
  3. Now let’s switch to JDeveloper for a while and create the new SOA Project (inside the adapter folder of the OSB project), which will define and hold the adapter artifacts.
    010_create-jdev-soa-project-in-osb-project 
  4. Choose Empty Composite for the Project template. We will only use the SCA composite to place a Database Adapter and we won’t use any of the components like BPEL or Mediator.
    015_create-empty-composite
  5. The empty composite window shows up. You can think of the Components section as the place where your OSB proxy and business services are located, although that will probably not be true before 11gR2. Using that mnemonic trick you can place the adapters in the same way as you are used to from SOA Suite: inbound adapters (file polling, database polling, de-queuing) should be placed on the Exposed Services swimlane and outbound adapters (file write, database read/write, enqueuing, …) should be placed on the External References swimlane.
    020_composite-overview 

All the JCA adapters used by one OSB project can be defined in the same SOA project.

Create the Database Adapter

With the project setup in place, let’s now configure the Database Adapter by going through the adapter wizard.

  1. We will need an outbound Database Adapter, so we drag it to the right-hand swimlane.

    025_drag-database-adapter-to-external-references
     
  2. Give the adapter service a meaningful name
    030_db-adapter-wizard-step2-4
  3. Create a connection to be used only at development time during the wizard and specify a JNDI name to be used to retrieve the database connection factory at runtime. The Connection Factory object needs to be set up on WebLogic before deploying the OSB project.
    032_db-adapter-wizard-step2-3
  4. For the Operation we choose Select, as we only want to read from the database.
    034_db-adapter-wizard-step4-5
  5. In the next step the tables to SELECT from are specified. We want to read from PERSON_T, ADDRESS_T and COUNTRY_T all together; the PERSON_T should be the root table to start the query from.
    036_db-adapter-wizard-step5-10
  6. The next step allows for creating the necessary relationships between the tables: PERSON_T has a 1:m relationship to ADDRESS_T, which has a 1:1 relationship to COUNTRY_T.
    038_db-adapter-wizard-step6-10
  7. In the Attribute Filtering step all the attributes returned from the tables are shown and you can uncheck the attributes you don’t want the service to return. Here we specify that we don’t want to return the ISO Country number. You can also see that the hierarchical structure resembles the relationships defined above.
    040_db-adapter-wizard-step7-10
  8. In the next step we define the restriction to be applied by the service. By default all the rows in PERSON_T would be returned. Our service should only return a given person identified by its primary key, so we define a personId parameter and add it in a WHERE clause (a conceptual SQL equivalent is sketched right after this list).
    042_db-adapter-wizard-step8-10
  9. With that the adapter is defined and we can click Finish on the next page. The adapter wizard now generates the necessary artifacts like WSDL, XSD, JCA configuration and TopLink mapping files.
    045_composite-with-adpter
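
Conceptually, the read operation defined by this wizard run corresponds to a query along the lines of the sketch below. The join and key columns are assumptions for illustration; the real SQL is generated by TopLink from the mappings created by the wizard.

    -- rough conceptual equivalent of the generated read operation (join columns assumed)
    SELECT p.*, a.*, c.*
      FROM person_t p
      JOIN address_t a ON a.person_id = p.id    -- 1:m relationship PERSON_T -> ADDRESS_T
      JOIN country_t c ON c.id = a.country_id   -- 1:1 relationship ADDRESS_T -> COUNTRY_T
     WHERE p.id = :personId;                    -- the personId parameter defined in step 8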

This finishes the work in JDeveloper. Let’s switch back to Eclipse and the OSB project created before.

Create the Business Service

In order to use the Database Adapter from OSB, we need either a business service or a proxy service configured to use the JCA transport. For outbound adapters a business service is necessary, whereas for inbound adapters a proxy service is used.

  1. First let’s make the artifacts from the JDeveloper project visible in the OSB project by doing a refresh on the adapter folder. We can see the structure of our SOA project nested in the OSB project.
    050_refresh-project-in-osb-project            055_osb-project-after-refresh
  2. Now let’s create the business service, which will wrap our outbound Database Adapter defined above. We can do that directly in Eclipse; there is no longer a need to use the OSB console for that. Just right-click on the JCA configuration file (RetrievePersonService_db.jca), select Oracle Service Bus | Generate Service and specify the name and the folder where the business service and the WSDL should be created (folder business-service).
    060_generate-business-service      062_generate-business-service
  3. The transport configuration is automatically done for us; nothing needs to be done here:
    065_busienss-service-with-jca-transport

With that, the business service is created and ready to be used. It could already be tested from the OSB console, but we want the service to be reachable from outside via a SOAP web service. So let’s create the proxy service that does exactly that.

Create the Proxy Service

When creating the proxy service it’s good to first think about the service interface it should provide. A SOAP based WebService interface is what we want, but what format do we use? Can’t we just use the WSDL generated by the Database Adapter also for the proxy service? It’s so easy, isn’t it?

It would be possible, but we would then expose information from the database to the outside and thereby create a much stronger coupling between the service consumer and the database than necessary. We would use a contract-last approach, where the contract is just generated from artifacts that happen to be available! A change on the database (table name, column name, data type) would almost certainly have an impact on the interface, something we definitely want to avoid when using the service in a larger context in a Service-Oriented Architecture (SOA).

What we want is a contract-first approach, where we can define the service contract independently first. Fortunately that’s well supported by OSB and easily achieved by defining a new WSDL, using it when defining the proxy service, and creating two transformations to translate to/from the new format.

The WSDL and XSD forming the PersonService service contract are available in the download. They use a canonical format of a person and its addresses, which is somewhat different from the format used on the database and independent of any backend service.

  1. First import the WSDL and XSD files into the wsdl folder of the OSB project
    067-copy-wsdl-and-xsd-into-osb-project
  2. Now let’s create the proxy service 
     070_create-proxy-service
  3. Select the WSDL PersonService-1.0.wsdl for the interface of the proxy service
    075_create-wsdl-based-proxy-service
  4. Add a Route Node with a Routing action to the empty message flow of the proxy service
    077_create-route-node 
  5. Select the business service just created as the service to call by the route action
    080_select-service-for-proxy

Create the two transformations

For the transformation of the request and of the response we need one XQuery transformation each.

  1. First we create the XQuery transformation for the request, which is very easy: all we need to map is the personId query parameter. With the graphical mapper feature provided by the OSB Eclipse plugin it’s even easier!
    090-data-transformation-request
  2. Next we create the XQuery transformation for the response. This is a bit harder, as more items need to be mapped, but with the built-in graphical mapper it’s again not a lot of work!
    092-data-transformation-response

Add transformation to the message flow of the proxy service

Now the only thing left to do is to insert the two transformations into the message flow.

  1. First we use the request transformation in the Request Action of the Routing action. By using a Replace action the already existing body with the <soap-env:Body> tag is reused and only the content is replaced by the result of the XQuery.
    095_use-request-data-transformation-with-replace
  2. The response is handled similarly to the request, by another Replace action
    100_use-response-data-transformation-with-replace
  3. In the parameter binding to the XQuery we manually have to specify the PersonTCollection element which holds the response from the DB adapter.
    101_select-xquery-for-response
  4. Additionally we also have to add a user-defined namespace
    102_custom-namespace

Create DataSource and Connection Factory in WebLogic

Before we can deploy and use the OSB service we need to create the necessary objects in WebLogic.

  1. First we create the DataSource object with the JNDI alias jdbc/SoaSampleDataSource
    109_data-source

  2. Create the DB Adapter Connection Factory object 110_create-connection-factory
  3. and configure the DataSource jdbc/SoaSampleDataSource created above
    112_define-data-source-in-connection-factory

Don’t forget to update the DB Adapter in order to activate the configuration changes.

Now it’s time to deploy and test the OSB service.

Deployment and Testing with soapUI

Deploy the OSB service to the OSB server and then start soapUI.

  1. Create a new soapUI project and specify the WSDL that the proxy service on the OSB exposes. On my machine this is http://localhost:7001/DbAdapterOSBProject/PersonService?wsdl.
    120_create-soap-ui-project
  2. Now let’s use the generated request and send a message with personId = 1.
    125_test-request-with-soapui
  3. We should get a successful response like the one shown in the image below. This is the information from the database in the canonical format.
    127_test-request-with-soapui-2

Conclusion

This finishes the tutorial of using the Database Adapter with the Oracle Service Bus.

I hope I was able to show how easy it is to integrate the JCA adapter framework with Oracle Service Bus 11g. Although there are two IDEs involved, the strategy of embedding the JDeveloper SOA project inside the OSB project helps keep the OSB project in sync with the SOA project. This makes it much easier to maintain the adapters: just change the settings by restarting the adapter wizard, and after refreshing the OSB project everything is in sync again.

In the next blog article I will show how to use the Database Adapter in an inbound scenario, where the adapter will trigger an OSB proxy service whenever a new row is added to the database.

The source code with the implementation of this use case can be downloaded from here.

Using the event API to publish an event to the Event Delivery Network (EDN) – the Spring way

The Event Delivery Network (EDN) in Oracle SOA Suite 11g provides a declarative way to use a publish/subscribe model to generate and consume business events without worrying about the underlying message infrastructure. Events can be published and subscribed to from a variety of programming environments such as Java, PL/SQL, SOA Composites and ADF-BC applications.

In his blog Clemens showed how the Event API can be used to publish an event programmatically to the Event Delivery Network (EDN). I took the sample code Clemens provided and did some refactoring to make it more “Spring like”.

My idea is to use some of the features of the Spring Framework (its event handling mechanism, dependency injection and O/X Mapping support) to simplify the publishing of Business Events to EDN from Java, without worrying about the details of the Java event API.

    The goal is to make publishing a Business Event to EDN from Java as simple as this:
// create a new customer
Customer cust = new Customer();
cust.setId(1);
cust.setFirstName("Guido");
cust.setLastName("Schmutz");

Address adr = new Address();
adr.setStreet("Papiermuehlestrasse 73");
adr.setZipCode(3014);
adr.setCity("Bern");
adr.setCountry("Switzerland");
cust.setAddress(adr);

// create a new customer event, passing the customer object as the payload
NewCustomerEvent event = new NewCustomerEvent(this, cust);

// publish the event through the Spring Event Handling mechanism
context.publishEvent(event);

    We just use simple Java objects (POJOs) to create an event and its payload and then publish it through the Spring application context as a standard Spring event.
    Sounds interesting? Yes, but what is needed behind the scenes to make this work?
    Before we start to dig into the Java implementation, let’s first create the Business Event inside the Oracle SOA Suite 11g and implement a composite subscribing to this event.

    First, you need to define the data shape of the event. This is conventionally done using an XML Schema (file customer.xsd).

    <?xml version="1.0" encoding="windows-1252" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                xmlns:cus="http://www.trivadis.com/cdm/customer"
                targetNamespace="http://www.trivadis.com/cdm/customer"
                elementFormDefault="qualified">
      <xsd:element name="customer">
        <xsd:complexType>
          <xsd:sequence>
            <xsd:element name="id" type="xsd:integer"/>
            <xsd:element name="firstName" type="xsd:string"/>
            <xsd:element name="lastName" type="xsd:string"/>
            <xsd:element name="address">
              <xsd:complexType>
                <xsd:sequence>
                  <xsd:element name="street" type="xsd:string"/>
                  <xsd:element name="city" type="xsd:string"/>
                  <xsd:element name="zipCode" type="xsd:integer"/>
                  <xsd:element name="country" type="xsd:string"/>
                </xsd:sequence>
              </xsd:complexType>
            </xsd:element>
          </xsd:sequence>
        </xsd:complexType>
      </xsd:element>
    </xsd:schema>
    
    

     

    After that the Business Event can be defined using the Event Definition Language (EDL).

    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <definitions xmlns="http://schemas.oracle.com/events/edl"
                 targetNamespace="http://www.trivadis.com/events/edl/CustomerEventDefinition">
      <schema-import namespace="http://www.trivadis.com/cdm/customer"
                     location="xsd/customer.xsd"/>
      <event-definition name="NewCustomer">
        <content xmlns:ns0="http://www.trivadis.com/cdm/customer"
                 element="ns0:customer"/>
      </event-definition>
    </definitions>
    

     

    Finally we can publish/subscribe to that event from a SOA Suite composite. For our example, I have defined a simple Mediator component listening to the NewCustomer event, which then passes it to a BPEL component for processing.

    tmp

       

      After deploying this SCA composite to the server, we are ready on the SOA Suite side to consume NewCustomer events. Now let’s implement the producer side in Java.

      Defining the Event in Java


      We start with the payload (the information) of the event. As shown in the Java snippet above, the goal is to pass the event payload as a normal Java object, without having to worry about the XML representation necessary for publishing the event through the API.

      The act of converting an XML document to/from an object is called Object/XML Mapping, or O/X Mapping for short. There are quite a lot of Java frameworks supporting Object/XML Mapping, such as JAXB, Castor, XMLBeans, JiBX or XStream. For this post I will use the JiBX framework, as it offers a flexible approach when starting from an XML Schema.

      In my previous post Using the new Object/XML Mapping Support of Spring 3.0 with JiBX and Maven I presented how to work with JiBX from Spring using Maven as the build tool. I will use the same approach here as well.

      With the XML Schema customer.xsd used for the BusinessEvent definition above as input, the JiBX framework generates the two Java classes Customer and Address defining the payload of our event in Java.

      public class Customer
      {
          private int id;
          private String firstName;
          private String lastName;
          private Address address;
      
          // public getter and setter method not shown
      }
      
      public class Address
      {
          private String street;
          private String city;
          private int zipCode;
          private String country;
      
          // public getter and setter method not shown
      }
      
      

      Next let’s define the NewCustomerEvent class, which defines the event itself and acts as a wrapper for the event payload.

      public class NewCustomerEvent extends AbstractApplicationEventForEDN {
      
      	private Customer customer;
      	private final static String NAMESPACE = "http://www.trivadis.com/events/edl/CustomerEventDefinition";
      	private final static String NAME = "NewCustomer";
      
      	
      	public NewCustomerEvent(Object source, Customer customer) 
                                throws XmlMappingException, ParserConfigurationException, IOException {
      		super(source, NAMESPACE, NAME, customer);
      	}
      }
      
      

      The payload, customer in this case, is passed as the 2nd argument of the constructor. The other properties necessary for a Business Event in EDN (namespace and name) are defined as constant values and passed to the constructor of the parent class.
      NewCustomerEvent inherits from AbstractApplicationEventForEDN, the base class for all events to be published to EDN. It defines the common properties of a Business Event, like namespace, local name, content, event id and conversation id.

      public abstract class AbstractApplicationEventForEDN extends ApplicationEvent {
      
      	private String namespace;
      	private String name;
      	private Object content;
      	private Object eventId;
      	private Object conversationId;
      
      	public AbstractApplicationEventForEDN(Object source, String namespace, String name, Object content) throws ParserConfigurationException, XmlMappingException, IOException {
      		super(source);
      		
      		this.namespace = namespace;
      		this.name = name;
      		this.content = content;
      		this.eventId = UUID.randomUUID();
      		this.conversationId = UUID.randomUUID();
      	}
      
      	// getter methods not shown
      }
      
      

      AbstractApplicationEventForEDN itself inherits from ApplicationEvent, a class provided by the Spring event handling mechanism. This means that such a Java Business Event is automatically a Spring ApplicationEvent so that it can be published in the Spring event handling mechanism like any other Spring event.

      So far we have a Java representation of the Business Event with its payload and we are able to publish these events through Spring.
      Next we will see how we can connect to EDN and publish the Business Events in a Spring way.

      Creating a FactoryBean to abstract away the creation of a BusinessEventConnection

      First let’s start with the link from Java to the Event Delivery Network. To talk to EDN, we need an instance of oracle.fabric.blocks.event.BusinessEventConnection. To create such an instance, the EDN Java API provides a factory interface named BusinessEventConnectionFactory together with an implementation, SAQRemoteBusinessEventConnectionFactory.
      I use a Spring FactoryBean to wrap the creation of a BusinessEventConnection.

      public class BusinessEventConnectionFactoryBean implements FactoryBean, InitializingBean {
      
      	DataSource dataSource;
      	BusinessEventConnectionFactory businessEventConnectionFactory;
      	
      	public void setDataSource(DataSource dataSource) {
      		this.dataSource = dataSource;
      	}
      
      	public void afterPropertiesSet() throws Exception {
              businessEventConnectionFactory = 
                  new SAQRemoteBusinessEventConnectionFactory(
                      dataSource, dataSource, null);
      	}
      
      	public Object getObject() throws Exception {
      		return businessEventConnectionFactory.createBusinessEventConnection();
      	}
      
      	public Class getObjectType() {
      		return BusinessEventConnection.class;
      	}
      
      	public boolean isSingleton() {
      		return false;
      	}
      }
      
      
      BusinessEventConnectionFactoryBean declares a dependency on a DataSource, which is necessary for creating a BusinessEventConnectionFactory. This means that the DataSource needs to be injected through Spring, so a bean configuration is necessary in the Spring context XML definition:
    <context:property-placeholder location="classpath:jdbc.properties"/>
    
    <bean id="businessEventConnection" class="com.trivadis.soa.BusinessEventConnectionFactoryBean">
    	<property name="dataSource" ref="dataSource"/>
    </bean>
    	
    <bean id="dataSource" class="oracle.jdbc.xa.client.OracleXADataSource">
    	<property name="user" value="${jdbc.username}"/>
    	<property name="password" value="${jdbc.password}"/>
    	<property name="URL" value="${jdbc.url}"/>
    	<property name="implicitCachingEnabled" value="false"/>
    </bean>
    

     

    The DB connection properties can be externalized into a properties file (jdbc.properties) by using the Spring context:property-placeholder element.

    jdbc.url=jdbc:oracle:thin:@soavm11.trivadis.com:1521:repoas
    jdbc.username=dev_soainfra
    jdbc.password=oracle
    

     

    When running in a Java EE application server, the DataSource can easily be retrieved via JNDI by switching the bean configuration as follows:

    <bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
        <property name="jndiName" value="jdbc/MyDataSource"/>
    </bean>
    
    

     

    With the BusinessEventConnection bean set up, let’s see how the Java event classes (e.g. NewCustomerEvent) can be published to EDN.

       

      Publishing the Event to the Event Delivery Network (EDN)

    Following Spring best practices, we first define an interface which the event publisher implementation will need to implement. The publishEvent() method declares a parameter of type AbstractApplicationEventForEDN, so any child class extending it will be accepted and published.

    public interface BusinessEventPublisher {
    
    	public abstract void publishEvent(AbstractApplicationEventForEDN applicationEvent)
    			throws XmlMappingException, IOException;
    
    }
    

     

    My implementation of the BusinessEventPublisher can be seen below with the BusinessEventPublisherEDN class. The publishEvent() method first marshals the payload (content property) of the AbstractApplicationEventForEDN into an XML document using the Spring O/X Mapping support. The Marshaller implementation to be used is injected by Spring at initialization time (for more information see my other blog post).

    After that an instance of oracle.fabric.common.BusinessEvent is created and published via the BusinessEventConnection, which is also injected by Spring (i.e. the bean declared before).

    public class BusinessEventPublisherEDN implements BusinessEventPublisher {
    
    	private BusinessEventConnection conn;
    	private Marshaller marshaller;
    
    	public void setBusinessEventConnection(BusinessEventConnection conn) {
    		this.conn = conn;
    	}
    
    	public void setMarshaller(Marshaller marshaller) {
    		this.marshaller = marshaller;
    	}
    
    	public void publishEvent(AbstractApplicationEventForEDN applicationEvent) throws XmlMappingException, IOException {
    		DocumentBuilderFactory documentBuilderFactory = DocumentBuilderFactory
    				.newInstance();
    		DocumentBuilder builder = null;
    		try {
    			builder = documentBuilderFactory.newDocumentBuilder();
    		} catch (ParserConfigurationException e) {
    			e.printStackTrace();
    		}
    		Document document = builder.newDocument();
    		DOMResult result = new DOMResult(document);
    
    		marshaller.marshal(applicationEvent.getContent(), result);
    		Element payload = document.getDocumentElement();
    
    		BusinessEvent event = buildEvent(applicationEvent, payload);
    		conn.publishEvent(event, 3);
    	}
    
    	private BusinessEvent buildEvent(
    			AbstractApplicationEventForEDN applicationEvent, Element payload) {
    		BusinessEventBuilder beb = BusinessEventBuilder.newInstance();
    		QName eventName = new QName(applicationEvent.getNamespace(), applicationEvent
    				.getName());
    		beb.setEventName(eventName);
    		beb.setBody(payload);
    		beb.setProperty(BusinessEvent.EVENT_ID, applicationEvent.getEventId());
    		beb.setProperty(BusinessEvent.PROPERTY_CONVERSATION_ID, applicationEvent
    				.getConversationId());
    		beb.setProperty(BusinessEvent.PRIORITY, 1);
    		BusinessEvent be = beb.createEvent();
    		return be;
    	}
    }
    
    

     

    I could use this EventPublisher implementation directly to publish events by injecting it into all the event producer beans.

    But my idea is to use the Spring event handling mechanism as the base event transport inside the Spring application and to have one common, central place where all Spring events of type AbstractApplicationEventForEDN are forwarded to Oracle EDN.

    For that I have implemented the Spring ApplicationListener interface, using the AbstractApplicationEventForEDN base class as the type parameter. This has the effect that the listener only subscribes to subclasses of AbstractApplicationEventForEDN, i.e. all my Java EDN events. The listener uses the injected BusinessEventPublisher instance to publish the events to EDN. This listener implementation basically acts as a bridge from the Spring event handling mechanism to the Event Delivery Network.

    public class ApplicationEventListenerEDN implements ApplicationListener<AbstractApplicationEventForEDN> {
    
    	private BusinessEventPublisher businessEventPublisher;
    	
    	public void setBusinessEventPublisher(
    			BusinessEventPublisher businessEventPublisher) {
    		this.businessEventPublisher = businessEventPublisher;
    	}
    
    	public void onApplicationEvent(AbstractApplicationEventForEDN applicationEvent) {
    		try {
    			businessEventPublisher.publishEvent(applicationEvent);
    		} catch (XmlMappingException e) {
    			e.printStackTrace();
    		} catch (IOException e) {
    			e.printStackTrace();
    		}
    	}
    }
    

    The configuration of these classes as Spring beans and the dependencies between them are shown here:

    <bean id="applicationEventListener" class="com.trivadis.soa.ApplicationEventListenerEDN">
    	<property name="businessEventPublisher" ref="businessEventPublisher"/>
    </bean>
    
    <bean id="businessEventPublisher" class="com.trivadis.soa.BusinessEventPublisherEDN">
    	<property name="businessEventConnection" ref="businessEventConnection"/>
    	<property name="marshaller" ref="marshaller"/>
    </bean>
    
    <oxm:jibx-marshaller id="marshaller" target-class="com.trivadis.cdm.product.Product"/>
    

     

    That’s it! The Java event producer implementation is ready to be tested.

       

      Testing the publishing of an Event


      I have used the Spring Integration Testing support to test the event publishing. The test class PublisherTest implements the Spring ApplicationContextAware interface. This will force the class to implement the setApplicationContext() method, so that the Spring ApplicationContext will be injected at runtime. This ApplicationContext is necessary to publish an event to the Spring event handling mechanism.

      @RunWith(SpringJUnit4ClassRunner.class)
      //ApplicationContext will be loaded from "classpath:/com/trivadis/soa/PublisherTest-context.xml"
      @ContextConfiguration
      public class PublisherTest implements ApplicationContextAware  {
      	
      	private ApplicationContext context;
      
      	public void setApplicationContext(ApplicationContext context) throws BeansException {
      		this.context = context;
      	}
      
      	@Test
      	public void testPublishNewCustomer() throws Exception {
      		Customer cust = new Customer();
      		cust.setId(1);
      		cust.setFirstName("Guido");
      		cust.setLastName("Schmutz");
      		Address adr = new Address();
      		adr.setStreet("Papiermuehlestrasse 73");
      		adr.setZipCode(3014);
      		adr.setCity("Bern");
      		adr.setCountry("Switzerland");
      		cust.setAddress(adr);
      			
      		NewCustomerEvent event = new NewCustomerEvent(this, cust);
      		context.publishEvent(event);
      	}
      }
      

      The test method uses the code shown at the beginning of this post. So the goal of simplifying the event publishing to EDN from Java has been achieved!

      The Enterprise Manager console proves that the event publishing to EDN works!

      temp[7]

    Above, the Flow Trace overview is shown; below, the details for the EventConsumer.

        temp[9]

         

        Source code

         

        The code shown in this post is only a proof of concept; before using it in a real project, some things like error handling need to be improved. All the source code can be downloaded from here.

        In addition to the NewCustomerEvent shown above, the downloadable version implements two other events, UpdateCustomerEvent and NewProductEvent.

      The download contains the following projects:

        • spring-edn (Eclipse/Maven): all the reusable classes described in my post.
        • spring-edn-sample (Eclipse/Maven): simulates a Spring application with the Business Event class definitions. Depends on the spring-edn project and references its classes when setting up the Spring context at startup time.
        • maven-local-repo (Maven): handles JARs that are not available in public Maven repositories but are necessary to build the two projects. There is a script which installs these JARs into the local repository on your machine. Except for jms.jar, the JAR files are not provided here (because I don’t know whether I would be allowed to); they are copied from your own local installation when running the script.
        • EDNProject (JDeveloper): SOA Suite 11g project defining the Business Event for the Event Delivery Network (EDN) and the composite subscribing to the event on EDN.

        In order for the JiBX customization to work when building the spring-edn-sample project, a patched version of the JiBX maven plugin (1.2.1.1a) is necessary, downloadable from here. For more information see my previous post. For the deployment of the projects, follow these steps:
      1. Deploy the EDNProject to a SOA Suite 11g instance.
      2. Install the missing JARs into your local Maven repository by running maven-local-repo/install-jars.bat. Don’t forget to change the environment variable to point to your local installation of FMW.
      3. Install the necessary artifacts into your local Maven repository.
      4. Build the spring-edn project using mvn install.
      5. Build the spring-edn-sample project using mvn install.

      Now you can test the installation by running the unit test class as shown above. I hope you are successful!

      Solving “javax.naming.NameNotFoundException: Unable to resolve ‘jdbc.SOAAppUserDataSource’ “ with Oracle SOA Suite 11g Adapter Services

      I ran into this rather “stupid user error” a couple of times already. It’s very easy to solve; however, it’s also very easy to get wrong when configuring Oracle SOA Suite 11g Adapter services through the WebLogic Console.

      I hope this blog entry will help new users when running into this problem:

      You get the following error in the SOA Server Log when testing your Adapter service:

      Exception occured when binding was invoked.
      Exception occured during invocation of JCA binding: “JCA Binding execute of Reference operation ‘insert’ failed due to: Could not create/access the TopLink Session.
      This session is used to connect to the datastore.
      Caused by Exception [TOPLINK-7060] (Oracle TopLink – 11g Release 1 (11.1.1.2.0) (Build 091016)): oracle.toplink.exceptions.ValidationException
      Exception Description: Cannot acquire data source [jdbc/SOAAppUserDataSource].
      Internal Exception: javax.naming.NameNotFoundException: Unable to resolve ‘jdbc.SOAAppUserDataSource’. Resolved ‘jdbc’; remaining name ‘SOAAppUserDataSource’.

      The invoked JCA adapter raised a resource exception.

       

      The setup of the Adapter Connection Factory configuration

      tmp

      as well as the corresponding DataSource (i.e. jdbc/SoaAppUserDataSource) seems to be correct at first sight.

      tmp[13] 

      But you might have forgotten to select the server on which the JDBC data source should be deployed when adding the DataSource in the first place. This can happen when you click the Finish button too early, before the last step, which asks you to select a target server!

      The missing target information for the SoaAppUserDataSource is clearly shown in the summary page of the JDBC Data Sources.

      tmp[9]

      To fix that, just edit the Data Source, click on the Targets tab and select the servers or cluster it should be deployed on (i.e. soa_server1 in my case):

      tmp[11]

      After that the error should disappear!

      I think it would be good to get an error/warning on the console when trying to add a DataSource without selecting a target server!

      Running an Oracle SOA Suite 11g Implementation Bootcamp in Geneva in December 2009

      I will be running the official Oracle SOA Suite 11g Implementation Bootcamp for Oracle Partners at Digicomp in Geneva in December (9. – 11.12.2009).

      The Oracle SOA Suite 11g Implementation Bootcamp was designed to provide the strategic direction of Oracle’s Fusion Middleware and its role in composite application development. It covers various components of Oracle SOA Suite 11g such as the Service Infrastructure Platform, Event Delivery Network, Adapter Services, Business Events, Mediator Components, BPEL Components, Business Rule Components, Human Workflow and Web Services Manager.

      If you are an Oracle System Integrator Partner and you want to see the Oracle SOA Suite 11g in action, this Bootcamp is for you! Here you will find some more detailed information as well as the link to register for this event.

      I’m looking forward to seeing you in Geneva!