The road to HAMS 3.0 – Transaction boundaries

HAMS is Atlassian’s order-processing system; if you’ve ever bought an Atlassian product, it’s HAMS that has been doing the work in the back-end. HAMS has served us well, but it is now over three years old and starting to show some wear, so we’ve set aside August this year to attack some of the technical debt and upgrade the core engine. In a series of blog posts we’ll be describing some of the technologies and trade-offs involved in a financial-processing system.

The parable of the Merchant and the Customer…

And it came to pass that a merchant had three amphorae of oil, and he took them to market to sell.

And a customer approached the merchant wishing to buy two amphorae, and offered the merchant two shekels each for them. And the merchant agreed, saying, “I could haggle, but that is a fair price and I am not a clichéd literary allusion”.

The customer was so happy with the price that he decided to buy the remaining amphora for another two shekels, but when he made to leave he realised that his donkey could not hold them. And he said to the merchant, “I must return one of these amphorae, for my donkey is an old beater and can’t hack it.” So the merchant gave the customer six shekels and took three amphorae back. And the merchant and the customer were greatly confused.

— Transactions 4:12

It’s 11pm…

At the heart of any financial system is the concept of transactions. These delimit the start and end of a sequence of operations that must be performed atomically; specifically, they must all occur (commit) or none occur (rollback). This behaviour is fundamental to database systems, and also to other systems where integrity is important, e.g. message brokers. Much of the work on HAMS 3.0 has been about defining where our transactions start and end, which resources are involved in them at any given point, and what to do when they fail. But first you have to have the ability to control your transaction boundaries. In the above parable, the merchant has engaged in a second transaction but has failed to declare it as such; consequently a roll-back event rolls back the first transaction too; his transactions are incorrectly delimited.
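To make that commit-or-rollback contract concrete, here is a minimal sketch using plain JDBC against an in-memory H2 database; the ledger table and the figures are purely illustrative:

[cc lang="java" line_numbers="false"]
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class TransactionSketch {
    public static void main(String[] args) throws SQLException {
        Connection con = DriverManager.getConnection("jdbc:h2:mem:market");
        try {
            Statement st = con.createStatement();
            st.executeUpdate("CREATE TABLE ledger (merchant INT, amphorae INT, shekels INT)");
            st.executeUpdate("INSERT INTO ledger VALUES (1, 3, 0)");

            con.setAutoCommit(false);   // start of the transaction boundary
            try {
                st.executeUpdate("UPDATE ledger SET amphorae = amphorae - 2 WHERE merchant = 1");
                st.executeUpdate("UPDATE ledger SET shekels = shekels + 4 WHERE merchant = 1");
                con.commit();           // both updates become visible together (commit)...
            } catch (SQLException e) {
                con.rollback();         // ...or neither is applied (rollback)
                throw e;
            }
        } finally {
            con.close();
        }
    }
}
[/cc]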

In modern Java systems there are basically two methods of managing transactions: manually via API calls, or declaratively via annotations. The API method gives the most control but requires a large amount of boiler-plate code (looking up the transaction manager, catching exceptions, etc.). Declarative transactions use Java 5’s annotations to mark method calls to be wrapped in transactions, leaving the management of those transactions up to a container. In practice there are two containers available that can do transaction management: J2EE application servers and Spring. In HAMS we use the latter, with Spring’s own @Transactional annotations.
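To illustrate the difference, here is a sketch of the same operation written both ways; the OrderDao and Order types are invented for the example, and the declarative version relies on the container to supply the equivalent of the boiler-plate in the first class:

[cc lang="java" line_numbers="false"]
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionTemplate;

// API-driven demarcation: the code is handed the transaction manager and
// manages the boundary itself (OrderDao and Order are invented for illustration).
class ManualOrderService {
    private final TransactionTemplate txTemplate;
    private final OrderDao orderDao;

    ManualOrderService(PlatformTransactionManager txManager, OrderDao orderDao) {
        this.txTemplate = new TransactionTemplate(txManager);
        this.orderDao = orderDao;
    }

    public void placeOrder(final Order order) {
        txTemplate.execute(new TransactionCallbackWithoutResult() {
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                orderDao.save(order);   // commits on return, rolls back on exception
            }
        });
    }
}

// Declarative demarcation: the method is simply marked, and the container
// wraps it in the equivalent of the code above.
class DeclarativeOrderService {
    private OrderDao orderDao;

    @Transactional
    public void placeOrder(Order order) {
        orderDao.save(order);
    }
}
[/cc]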

… do you know where your transactions are?

But annotations are just that: markers in the class files that denote information. So how do they get converted into the code that calls the transaction manager?

The default mechanism for doing this in Spring is to hijack the dependency-injection mechanism; instead of injecting the class containing the transactional method, the container injects a proxy class that is generated at runtime. This class wraps the real class, calling the transaction manager to set up the transaction, then calling the real method, catching any exceptions and finally committing or rolling back. This approach has a number of advantages, not least that it is simple to set up, as the magic required is all handled by Spring behind the scenes. But it has one serious problem…
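Before getting to that problem, it is worth sketching roughly what such a proxy does. The hand-written class below is a heavily simplified stand-in (the real proxies are generated with JDK dynamic proxies or CGLIB and delegate to Spring’s TransactionInterceptor), and the AccountService and Account types are invented for illustration:

[cc lang="java" line_numbers="false"]
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.DefaultTransactionDefinition;

// Hand-written approximation of the proxy Spring generates at runtime. The container
// injects something shaped like this wherever an AccountService is required.
class AccountServiceTxProxy extends AccountService {
    private final AccountService target;                 // the real, annotated bean
    private final PlatformTransactionManager txManager;

    AccountServiceTxProxy(AccountService target, PlatformTransactionManager txManager) {
        this.target = target;
        this.txManager = txManager;
    }

    @Override
    public void renew(Account account) {
        TransactionStatus tx = txManager.getTransaction(new DefaultTransactionDefinition());
        try {
            target.renew(account);    // call through to the real method
            txManager.commit(tx);
        } catch (RuntimeException e) {
            txManager.rollback(tx);   // unchecked exceptions trigger a roll-back
            throw e;
        }
    }
}
[/cc]

The crucial detail is that all of the transaction logic lives in the wrapper; only calls that arrive through the wrapper ever see it.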

Consider the following batch job class (simplified, but based on a real class in HAMS):
[cc lang="java" line_numbers="false"]
class AutoRenewalsJob {

    private AccountDao accountDao;
    private PaymentProcessor paymentProcessor;

    @Transactional  // All DB operations require a transaction
    public void process() {
        List<Account> accounts = accountDao.getRenewalCandidates();

        for (Account acc : accounts) {
            renewAccount(acc);
        }
    }

    // Process each account in a new transaction so they are committed individually
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    private void renewAccount(Account account) {
        boolean paid = paymentProcessor.chargeForAccount(account);
        if (paid) {
            account.updateWithRenewal();
            accountDao.save(account);
        }

        // Actual DB commit occurs on return
    }
}
[/cc]

On the surface this seems straightforward: we read the candidate accounts in, then create a new transaction for each account update so that each one is committed separately; in the case of an error only the most recent account is rolled back. Except that this isn’t what happens: if there is an error then all of the account updates will roll back. The reason is that under Spring’s proxy-class transactions the annotations only take effect when you call a method via an injected (proxy) bean; the transactional magic all occurs within the proxy, and internal method calls never pass through it.

There are some ways of working around this, but none of them are pretty: the code can look up the proxy at run-time (which requires it to be aware of the container), or you can refactor the class into multiple related classes that are then injected into each other. The latter is the recommended method but results in artificially disjointed code; in extreme cases you end up with a class per method. The other problem is that misuse of the @Transactional annotation is hard to detect in an automated manner; static-analysis tools may be able to pick up the problem, but these are not in wide use.
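As an illustration, here is the “split into collaborating beans” workaround applied to the renewals job; AccountRenewer is an invented name, and the artificial split is exactly the disjointedness complained about above:

[cc lang="java" line_numbers="false"]
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

class AutoRenewalsJob {
    private AccountDao accountDao;
    private AccountRenewer accountRenewer;   // injected, so calls to it go via a proxy

    @Transactional   // All DB operations require a transaction
    public void process() {
        for (Account acc : accountDao.getRenewalCandidates()) {
            accountRenewer.renewAccount(acc);   // external call: REQUIRES_NEW now takes effect
        }
    }
}

class AccountRenewer {
    private AccountDao accountDao;
    private PaymentProcessor paymentProcessor;

    // Must be public so the proxy can intercept it
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void renewAccount(Account account) {
        if (paymentProcessor.chargeForAccount(account)) {
            account.updateWithRenewal();
            accountDao.save(account);
        }
    }
}
[/cc]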

However, there is an alternative way of implementing transactional annotations: AspectJ. This method actually modifies the annotated classes to inject (weave, in AspectJ terms) the transaction-management code into the methods, guaranteeing that it will be called at runtime. There are two ways this can be done: compile-time weaving or runtime weaving. Runtime weaving is superficially attractive as it involves no modifications to the existing build environment, but in practice it requires custom classloaders or an instrumentation agent to be deployed with the container; in cases where you don’t control the deployment environment this may not be possible. So compile-time weaving is the way to go.
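(For reference, the runtime alternative typically amounts to a Spring context entry like the fragment below, a sketch only, plus an instrumentation agent such as spring-instrument.jar or aspectjweaver.jar passed to the JVM via -javaagent; that agent is exactly the deployment-environment dependency described above.)

<!-- Spring context fragment enabling load-time weaving; requires the "context" namespace -->
<context:load-time-weaver aspectj-weaving="on"/>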

Setting up AspectJ weaving with Maven

To do compile-time weaving we need to defer compilation to AspectJ’s compiler. How this is done depends on your build-system, but Maven has a plugin that does this for us by hooking into the compilation phase:

<project>
    ......
    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>aspectj-maven-plugin</artifactId>
                <version>1.0</version>
                <dependencies>
                    <dependency>
                        <groupId>org.aspectj</groupId>
                        <artifactId>aspectjrt</artifactId>
                        <version>${aspectj.version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.aspectj</groupId>
                        <artifactId>aspectjtools</artifactId>
                        <version>${aspectj.version}</version>
                    </dependency>
                </dependencies>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>test-compile</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <outxml>true</outxml>
                    <aspectLibraries>
                        <aspectLibrary>
                            <groupId>org.springframework</groupId>
                            <artifactId>spring-aspects</artifactId>
                        </aspectLibrary>
                    </aspectLibraries>
                    <source>1.6</source>
                    <target>1.6</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
 
    <dependencies>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjrt</artifactId>
            <version>${aspectj.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjweaver</artifactId>
            <version>${aspectj.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-tx</artifactId>
            <version>${spring.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-aspects</artifactId>
            <version>${spring.version}</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>
     ....
</project>

Note that we hook into the test-compilation as well as the compilation of the deployed code. This is important because…

How do you know it’s working?

Part of the take-away from this article should be that we can’t assume that our annotations are automatically having an effect; we need to get our configuration right, which means we need to test it. We could do this by bumping up the Spring log-levels and doing a visual inspection, but there’s a better way.

Spring has a test framework that integrates with JUnit and allows you to run tests under a Spring configuration. In extreme cases this can be used to initialise and control a major sub-set of your application and run it through various scenarios. But in our case we’re going to use it to bring up our DAO layer and run some transaction commit/rollback scenarios to check that our new AspectJ-woven classes are doing what they’re supposed to do.

First we need to create a Spring environment that the test will run under. Ideally this should closely resemble your production configuration:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:tx="http://www.springframework.org/schema/tx"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx.xsd">
 
    <!-- First tell Spring to manage our transactions -->
    <bean class="org.springframework.orm.jpa.support.PersistenceAnnotationBeanPostProcessor"/>
    <tx:annotation-driven mode="aspectj" />
 
    <!-- We need a temporary database to work on; we use H2 as it has the closest semantics to our production DB (PostgreSQL) -->
    <bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
        <property name="driverClassName" value="org.h2.Driver" />
        <property name="url" value="jdbc:h2:mem:testdb;MVCC=TRUE;LOCK_TIMEOUT=60000"/>
    </bean>
 
    <!-- Setup the JPA environment. This is standard stuff so I won't go into it in detail -->
    <bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
        <property name="persistenceUnitName" value="hams"/>
        <property name="persistenceXmlLocation" value="classpath*:AspectJTxTest-persistence.xml"/>
        <property name="dataSource" ref="dataSource" />
        <property name="jpaVendorAdapter">
            <bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
                <property name="generateDdl" value="true"/>
                <property name="databasePlatform" value="org.hibernate.dialect.H2Dialect"/>
            </bean>
        </property>
    </bean>
 
    <bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
        <property name="entityManagerFactory" ref="entityManagerFactory"/>
    </bean>
 
    <!-- We use one of our simpler domain objects for testing -->
    <bean id="dao" class="com.atlassian.hams.dao.impl.GenericPropertyDaoImpl"/>
 
    <!-- We use DBUnit to simplify DB management -->
    <bean id="databaseConnection" class="org.dbunit.database.DatabaseConnection">
        <constructor-arg index="0">
            <bean factory-bean="dataSource" factory-method="getConnection"/>
        </constructor-arg>
    </bean>
     
</beans>
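The entityManagerFactory above points at a test-specific persistence.xml, which isn’t reproduced here. For completeness, a minimal sketch of what such a file could look like follows; the entity’s package name is an assumption made for the illustration:

<!-- AspectJTxTest-persistence.xml (illustrative sketch). The dataSource comes from Spring,
     so no connection details are needed here. -->
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
    <persistence-unit name="hams" transaction-type="RESOURCE_LOCAL">
        <!-- The package of GenericProperty is assumed for this sketch -->
        <class>com.atlassian.hams.domain.GenericProperty</class>
        <exclude-unlisted-classes>true</exclude-unlisted-classes>
    </persistence-unit>
</persistence>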

Now we need a test for the configuration. To do this we create an internal-only method in the test, write some data, then force a roll-back by throwing a runtime exception. We can then check the database to see what happened:

[cc lang="java" line_numbers="false"]
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"/AspectJTxTest-ctx.xml"})
@TransactionConfiguration
public class AspectJTxITCase
{
    @Resource GenericPropertyDao dao;
    @Resource protected DataSource dataSource;
    @Resource protected IDatabaseConnection databaseConnection;

    // Our internally-called method.  Note that it throws an exception, which triggers a roll-back.
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    private void addGPAndThrow() {
        GenericProperty gp = new GenericProperty();
        gp.setKey("test1");
        gp.setTextValue("test1");
        dao.save(gp);
        throw new RuntimeException();
    }

    @Test
    @Transactional
    public void commitInSubtransaction() {
        GenericProperty gp = dao.find(1L);
        assertNull(gp);

        try {
            addGPAndThrow();
        } catch(RuntimeException e) {
            // Ignore
        }

        // If the transaction declared for addGPAndThrow() is effective (i.e. we're using AspectJ
        // transactions) we should still get null for find(1), as the insert will have been rolled back;
        // however, for Spring proxied transactions the declaration is ignored, so we'll catch the
        // exception before it crosses a transaction boundary and triggers a rollback.
        gp = dao.find(1L);
        // assertTrue(gp.getId().equals(1L));  // This is True for Spring AOP, NOT AspectJ.
        assertNull(gp);  // True when using AspectJ
    }
}
[/cc]

Under the hood

If you’re like me, you’ll want to take a peek under the hood and see what Spring and AspectJ are doing to our classes. Unfortunately the AspectJ compiler works at the byte-code level and I’ve never been able to convince it to output the intermediate Java code. However, using JD we can decompile the resulting class and see the magic in action (with a little clean-up):
[cc lang="java" line_numbers="false"]
@Transactional(propagation=Propagation.REQUIRES_NEW)
  private void addGPAndThrow() {
      try {
          try {
              AnnotationTransactionAspect.aspectOf()
                  .ajc$before$org_springframework_transaction_aspectj_AbstractTransactionAspect$1$2a73e96c(this, ajc$tjp_1);

              GenericProperty gp = new GenericProperty();
              gp.setKey("test1");
              gp.setTextValue("test1");
              this.dao.save(gp);
              throw new RuntimeException();

          } catch (Throwable localThrowable) {
              AnnotationTransactionAspect.aspectOf()
                  .ajc$afterThrowing$org_springframework_transaction_aspectj_AbstractTransactionAspect$2$2a73e96c(this, localThrowable);
              throw localThrowable;
          }
          AnnotationTransactionAspect.aspectOf()
              .ajc$afterReturning$org_springframework_transaction_aspectj_AbstractTransactionAspect$3$2a73e96c(this);

       } catch (Throwable localThrowable1) {
          AnnotationTransactionAspect.aspectOf()
              .ajc$after$org_springframework_transaction_aspectj_AbstractTransactionAspect$4$2a73e96c(this);
          throw localThrowable1;
      }

      AnnotationTransactionAspect.aspectOf()
          .ajc$after$org_springframework_transaction_aspectj_AbstractTransactionAspect$4$2a73e96c(this);
  }
[/cc]
As we can see, AspectJ defers the setup and teardown of transactions to TransactionAspectSupport, and rollback is handled by completeTransactionAfterThrowing.
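The decision whether to roll back is driven by the transaction attribute's rollbackOn() rule, and the default rule rolls back on unchecked exceptions only. A small sketch (assuming spring-tx on the classpath) shows the rule directly:

[cc lang="java" line_numbers="false"]
import org.springframework.transaction.interceptor.DefaultTransactionAttribute;

// Demonstrates the default rollback rule consulted when a woven method throws:
// RuntimeException and Error trigger a roll-back, checked exceptions do not.
public class RollbackRuleDemo {
    public static void main(String[] args) {
        DefaultTransactionAttribute attr = new DefaultTransactionAttribute();
        System.out.println(attr.rollbackOn(new RuntimeException()));    // true  -> roll-back
        System.out.println(attr.rollbackOn(new java.io.IOException())); // false -> commit
    }
}
[/cc]

This is also why the RuntimeException thrown in the test above is enough to trigger the roll-back.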

Next time…

Observant readers will have noticed that there is still a possible bug in our renewals job: although the database operation will roll back, the account may have been charged regardless, depending on how our payment-processor is implemented. Next time I’ll be talking about this, and about how to do atomic and idempotent transactions across multiple resources, some of which were never designed to be transactional in the first place.


High on ACID? The Atlassian Internal Systems team is hiring; see the jobs page for details.
