Author Archives: Nepomuk Seiler

Simple JUnit Tests with Tycho and Surefire

Eclipse Tycho requires a special packaging type for test bundles: eclipse-test-plugin. This is fine when you have your own Eclipse-based project with all the modularity you want. However, sometimes you have legacy libraries, or you want to keep your source code and test code close to each other and don’t want to create another plugin just to run the tests, like in this project.

Tycho has its own surefire plugin, which doesn’t cover this case, so you need to configure the good old maven-surefire-plugin for your needs. Before explaining, this is what the important part of the pom.xml looks like:

<!-- plain surefire tests without tycho -->
<testSourceDirectory>src/test/java</testSourceDirectory>
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.12.4</version>
    <executions>
      <execution>
        <id>test</id>
        <phase>test</phase>
        <configuration>
          <includes>
            <include>**/*Test.java</include>
          </includes>
        </configuration>
        <goals>
          <goal>test</goal>
        </goals>
      </execution>
    </executions>
  </plugin>
 
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>2.5.1</version>
    <executions>
      <execution>
        <id>compiletests</id>
        <phase>test-compile</phase>
        <goals>
          <goal>testCompile</goal>
        </goals>
      </execution>
    </executions>
  </plugin>
</plugins>

In short:

  1. You must specify the test source directory (src/test/java).
  2. You have to bind the maven-compiler-plugin to the test-compile phase so this directory gets compiled.
  3. Activate the maven-surefire-plugin.
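
For completeness, a test class picked up by this configuration could look like the following minimal sketch (JUnit 4 on the test classpath is assumed; package and class names are made up):

package de.mukis.example;

import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Lives in src/test/java and ends with "Test", so it matches the **/*Test.java include above.
public class CalculatorTest {

	@Test
	public void addShouldSumTwoNumbers() {
		assertEquals(4, 2 + 2);
	}
}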

Thanks to this mailing-list post.

PDE target platform cache path

Sometimes you build a broken bundle and publish it on a local update site for some of your colleagues. However, your colleagues have already set up their target platform and Eclipse PDE has cached the bundles. PDE seems to be really persistent when it comes to using cached bundles: deleting and resetting the target platform didn’t work for me. So I wanted to replace the bundle in the cache, but where is the folder?

Short answer:

{workspace}/.metadata/.plugins/org.eclipse.pde.core/.bundle_pool/plugins/

How I found it:

  1. Go to your workspace
  2. Run find -name 'bundle.name*' (with your bundle’s name), which reveals the caching directory and the location of your file
  3. Replace the incorrect bundle in the cache

Note:

This is a quick hack for development environments. If a release is broken you should notice before you publish the site. And if it happens anyway, update the version of your broken bundle and republish, so the newer version is fetched.

Gradient Descent with Scala

Currently I’m watching a Scala and a Machine Learning course on coursera.org and wanted to try some simple stuff for myself. I figured gradient descent would be a perfect start to try some functional programming.

The code

import scala.math._

object GradientDecent extends App {

  val alpha = 0.1 // size of the steps taken in gradient descent
  val samples = List((Vector(0.0, 0.0), 2.0), (Vector(3.0, 1.0), 12.0), (Vector(2.0, 2.0), 18.0))

  var tetas = Vector(0.0, 0.0, 0.0)
  // the number of iterations is a free choice; 1000 is an arbitrary value
  for (i <- 0 until 1000) {
    tetas = tetas.zipWithIndex.map {
      // teta_0 is updated with a constant feature value of 1
      case (teta, 0) =>
        teta - (alpha / samples.size) * samples.foldLeft(0.0) {
          case (sum, (x, y)) => decentTerm(sum, 1, x, y, tetas)
        }
      // every other teta_i is updated with its feature value x(i - 1)
      case (teta, i) =>
        teta - (alpha / samples.size) * samples.foldLeft(0.0) {
          case (sum, (x, y)) => decentTerm(sum, x(i - 1), x, y, tetas)
        }
    }
  }

  // one summand of the gradient: x_j * (h(x) - y)
  def decentTerm(sum: Double, x_j: Double, x: Vector[Double], y: Double, tetas: Vector[Double]) = {
    sum + x_j * (h(x, tetas) - y)
  }

  // the hypothesis: teta(0) + teta(1) * x(0) + teta(2) * x(1) + ...
  def h(x: Vector[Double], teta: Vector[Double]): Double = {
    teta(0) + {
      for (i <- 1 until teta.size) yield teta(i) * x(i - 1)
    }.foldLeft(0.0)((sum, x) => sum + x)
  }

}

And that’s pretty much everything. This is just a first version and I’m sure somebody will find ways to optimize it. However, even this hacked version is very short and handsome :)

Update
The code snippet here is a gradient descent for performing linear regression.
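
Written out in the usual notation (theta corresponds to tetas, m = samples.size and x_0 = 1), the hypothesis and the update rule the loop above implements are:

h_\theta(x) = \theta_0 + \sum_{j=1}^{n} \theta_j x_j,
\qquad
\theta_j \leftarrow \theta_j - \frac{\alpha}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}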

DevVM Part 1 – Gerrit on Ubuntu 12.04 Server

I’m currently working on a little development VM and want to share some of the insights I gain and how I managed to get things to work. The series starts with a tutorial on installing Gerrit.

What is Gerrit?

Gerrit provides a powerful server to integrate a code review process into your Git-driven development workflow. These are the main reasons I picked Gerrit:

  • Supports Git as the version control system – awesome
  • Integrates with build servers like Jenkins, so tests run automatically and the CI server becomes part of the code review process
  • Great Eclipse integration with EGit

Install Gerrit

All you need is root shell access to your server and a working internet connection (surprise!)

Generate gerrit2 user

First we create a group gerrit2 and a user gerrit2 with the home directory located at /usr/local/gerrit2:

sudo addgroup gerrit2
sudo adduser --system --home /usr/local/gerrit2 --shell /bin/bash --ingroup gerrit2 gerrit2

I use my own MySQL database instead of the integrated H2 database. You have to create a database user gerrit2 as well, and a database called reviewdb. On the shell you can do this via:

mysql --user=root -p
CREATE USER 'gerrit2'@'localhost' IDENTIFIED BY 'secret';
CREATE DATABASE reviewdb;
ALTER DATABASE reviewdb charset=latin1;
GRANT ALL ON reviewdb.* TO 'gerrit2'@'localhost';
FLUSH PRIVILEGES;
exit;

The last thing to do as root is to create a default config file for Gerrit:

sudo touch /etc/default/gerritcodereview

and insert, using an editor of your choice:

GERRIT_SITE=/usr/local/gerrit2

Now we log into our gerrit2 user and install gerrit.

sudo su gerrit2
cd ~
wget http://gerrit.googlecode.com/files/gerrit-2.4.2.war
java -jar gerrit-2.4.2.war init -d /usr/local/gerrit2

The download URL may have changed in the meantime, so check it.

Fill out everything for your needs. The database password is the secret you chose above. Check that everything works by starting and stopping Gerrit with

cd ~/bin
./gerrit.sh start
./gerrit.sh stop

If everything works fine, you can update your init.d configuration so Gerrit starts automatically on boot. You do this with the following commands:

sudo ln -snf /usr/local/gerrit2/bin/gerrit.sh /etc/init.d/gerrit
sudo update-rc.d gerrit defaults

Now your Gerrit server starts each time your machine starts.

Troubleshooting

I made some errors during the installation which almost drove me crazy.

Authentication via OpenID – Register new Email

It’s great that you can access the Gerrit server with OpenID. However, if the email on your OpenID account (like *@gmail.com) differs from the one on your SSH key (like *@your-company.com), you have to register a new email address on your account. That only works if your SMTP server is configured correctly.

By default Gerrit uses “user@hostname” as the sender. For me that was “gerrit@server”, which isn’t a valid email address. You can configure the sender in the user section of the Gerrit configuration:

[user]
      name = Your name
      email = name@your-company.com

Maven – Tycho, Java, Scala and APT

This tutorial shows a small project which is built with Maven Tycho and has the following requirements:

  • Mixed Java / Scala project
  • Eclipse plugin deployment
  • Eclipse Annotation Processing (APT)
  • Manifest-first approach
  • Java 7 / Scala 2.9.2

That doesn’t sound too hard. In fact it isn’t, if you are familiar with Maven and how Tycho works.

Setting up maven

First download Maven 3 and configure it.
I created two profiles in my settings.xml and added some repositories.
My two profiles are tycho and scala, which are activated when the
corresponding property (tycho-build or scala-build) is present:

<settings>
 <profiles>
  <profile>
   <id>tycho</id>
   <activation>
    <activeByDefault>false</activeByDefault>
    <property>
     <name>tycho-build</name>
    </property>
   </activation>
   <repositories>
    <repository>
     <id>eclipse-indigo</id>
     <layout>p2</layout>
     <url>http://download.eclipse.org/releases/indigo</url>
    </repository>
    <repository>
     <id>eclipse-sapphire</id>
     <layout>p2</layout>
     <url>http://download.eclipse.org/sapphire/0.4.1/repository</url>
    </repository>
    <repository>
     <id>eclipse-scala-ide</id>
     <layout>p2</layout>
     <url>http://download.scala-ide.org/releases-29/milestone/site</url>
    </repository>
    <repository>
     <id>eclipse-gemini-dbaccess</id>
     <layout>p2</layout>
     <url>http://download.eclipse.org/gemini/dbaccess/updates/1.0</url>
    </repository>
   </repositories>
  </profile>

  <profile>
   <id>scala</id>
   <activation>
    <activeByDefault>false</activeByDefault>
    <property>
     <name>scala-build</name>
    </property>
   </activation>
   <repositories>
    <repository>
     <id>scala-tools.org</id>
     <name>Scala-tools Maven2 Repository</name>
     <url>http://scala-tools.org/repo-releases</url>
    </repository>
    <repository>
     <id>typesafe</id>
     <name>Typesafe Repository</name>
     <url>http://repo.typesafe.com/typesafe/releases/</url>
    </repository>
   </repositories>
   <pluginRepositories>
    <pluginRepository>
     <id>scala-tools.org</id>
     <name>Scala-tools Maven2 Repository</name>
     <url>http://scala-tools.org/repo-releases</url>
    </pluginRepository>
   </pluginRepositories>
  </profile>
 </profiles>
</settings>

Setting up the project – The tycho build

For my project I just used two simple plugins. Nothing fancy here.
  1. Create a plugin project
  2. Add some dependencies
  3. Write some classes in Java

I recommend the following project structure:

root-project/
 plugin.core
 plugin.ui
 plugin.xy

Go to your root-project folder in your favorite console and use the following command to generate the pom.xml files with Tycho:

mvn org.sonatype.tycho:maven-tycho-plugin:generate-poms -DgroupId=de.mukis -Dtycho.targetPlatform=path/to/target/platform/

which generates a first project setup for you. A few things to “tweak”, which most of the other tutorials treat as best practice:

  • Replace all concrete version numbers with property placeholders, e.g. 0.12.0 with ${tycho.version}
  • Remove all groupId and version tags in the module pom.xml files. These are inherited from the parent pom.xml.
  • Check your folder structure. Tycho infers AND changes your source directory according to your build.properties.

Next add the p2 repositories needed to resolve all dependencies. This is done via the <repository> tag. The full pom.xml is at the end.

Sometimes you have existing OSGi bundles but no p2 repository to obtain them from. Eclipse PDE has a nice extra feature for you: the Features and Bundles Publisher application. Note: it’s very important that your repository folder contains the two folders plugins and features.

Now you can run your maven build with

mvn clean package

and you will get a nicely packaged OSGi bundle.

Setting up the project – The scala build

So now we want to add some Scala classes. Create a new source folder src/main/scala and create some classes. Don’t forget to import the Scala packages, so your MANIFEST.MF contains something like:
Import-Package: org.osgi.framework;version="1.6.0",
 scala;version="[2.9.0.1,2.9.3.0]",
 scala.collection;version="[2.9.0.1,2.9.3.0]",
 scala.collection.generic;version="[2.9.0.1,2.9.3.0]",
 scala.collection.immutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.interfaces;version="[2.9.0.1,2.9.3.0]",
 scala.collection.mutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel.immutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel.mutable;version="[2.9.0.1,2.9.3.0]",
 scala.concurrent;version="[2.9.0.1,2.9.3.0]",
 scala.concurrent.forkjoin;version="[2.9.0.1,2.9.3.0]",
 scala.io;version="[2.9.0.1,2.9.3.0]",
 scala.math;version="[2.9.0.1,2.9.3.0]",
 scala.parallel;version="[2.9.0.1,2.9.3.0]",
 scala.ref;version="[2.9.0.1,2.9.3.0]",
 scala.reflect,
 scala.reflect.generic;version="[2.9.0.1,2.9.3.0]",
 scala.runtime;version="[2.9.0.1,2.9.3.0]",
 scala.text;version="[2.9.0.1,2.9.3.0]",
 scala.util;version="[2.9.0.1,2.9.3.0]",
Now there are two alternatives to build this. I chose to add the source folder to my build.properties and exclude the .scala files in my Maven pom. The alternative is described here.
We need the maven-scala-plugin. Add the repository
...
 <repository>
  <id>scala-tools.org</id>
  <name>Scala-tools Maven2 Repository</name>
  <url>http://scala-tools.org/repo-releases</url>
 </repository>
...
 <pluginRepository>
  <id>scala-tools.org</id>
  <name>Scala-tools Maven2 Repository</name>
  <url>http://scala-tools.org/repo-releases</url>
 </pluginRepository>
and to our root pom.xml we add the maven-scala-plugin
<plugin>
 <groupId>org.scala-tools</groupId>
 <artifactId>maven-scala-plugin</artifactId>
 <version>2.15.0</version>
 <executions>
  <execution>
   <id>compile</id>
   <goals>
    <goal>compile</goal>
   </goals>
   <phase>compile</phase>
  </execution>
 
  <execution>
   <id>test-compile</id>
   <goals>
    <goal>testCompile</goal>
   </goals>
   <phase>test-compile</phase>
  </execution>
 
  <execution>
   <phase>process-resources</phase>
   <goals>
    <goal>compile</goal>
   </goals>
  </execution>
 </executions>
</plugin>
There is actually an easier configuration, but it doesn’t work with circular dependencies.
If you have added the src/main/scala folder to your build.properties, then you have to add another plugin configuration to prevent Tycho from exporting all Scala source files.
<plugin>
 <groupId>org.eclipse.tycho</groupId>
 <artifactId>tycho-compiler-plugin</artifactId>
 <version>${tycho.version}</version>
 <configuration>
  <excludeResources>
   <excludeResource>**/*.scala</excludeResource>
  </excludeResources>
 </configuration>
</plugin>
Now the build should work with scala, too.

Setting up the project – APT code generation with Eclipse Sapphire

I’m creating some models with Eclipse Sapphire, which uses Java annotation processing (APT) to generate the model implementations. The apt-maven-plugin is a Maven plugin that allows us to trigger a processing factory during the build. The current version, alpha-04, has a bug which leads to an error with Java 7. So before we can use this plugin, you have to check out the source code, build the latest alpha-05 version (it’s not released at the moment) and install it in your local Maven repository.
Now you can add the apt-maven-plugin to the plugin which needs APT. This could look like:
<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<modelVersion>4.0.0</modelVersion>
 
<parent>
 <groupId>de.lmu.ifi.dbs.knowing</groupId>
 <artifactId>Knowing</artifactId>
 <version>0.1.4-SNAPSHOT</version>
</parent>
 
<artifactId>de.lmu.ifi.dbs.knowing.core</artifactId>
<packaging>eclipse-plugin</packaging>
 
<build>
 <plugins>
  <plugin>
   <groupId>org.codehaus.mojo</groupId>
   <artifactId>apt-maven-plugin</artifactId>
   <version>1.0-alpha-5-SNAPSHOT</version>
   <executions>
    <execution>
     <goals>
      <goal>process</goal>
     </goals>
    </execution>
   </executions>
   <configuration>
  <factory>org.eclipse.sapphire.sdk.build.processor.internal.APFactory</factory>
   </configuration>
  </plugin>
 </plugins>
</build>
</project>
At last you have to add the factory as an optional dependency to the MANIFEST.MF of the plugin using APT:
org.eclipse.sapphire.sdk;bundle-version="[0.4.0,0.5.0)";resolution:=optional,
org.eclipse.sapphire.sdk.build.processor;bundle-version="[0.4.0,0.5.0)";resolution:=optional
If you trigger the build, you will see that your APT sources are generated in target/generated-sources/apt. However, the files are not compiled. At first I tried the build-helper-maven-plugin, but Tycho seems to override these settings. So I added target/generated-sources/apt to the build.properties of the plugin using APT, which seems like a bad workaround to me. However, it works fine.

Source Code

You can find the code in my github repository.

Conclusion

For a beginner it was not that easy to avoid all the little traps with Tycho, Scala, Maven and APT. But in the end I hope to save a lot of time when building and testing.

Things to add

The tutorial doesn’t include any testing.

Links

https://github.com/muuki88/tycho
http://wiki.eclipse.org/Tycho/Reference_Card
http://mattiasholmqvist.se/2010/02/building-with-tycho-part-1-osgi-bundles/
https://github.com/misto/Scala-Hello-World-Plug-in
Compiling circular dependent java-scala classes
Eclipse sapphire and tycho
compile generated sources
http://mojo.codehaus.org/apt-maven-plugin/
APT M2E Connector
Publish pre-compiled bundles in p2 repository

Software Development at Universities

A lot of frameworks and applications are developed at universities. Some have practical reasons, some are developed to support research activities or are the main goal of the research process, and some weren’t even planned at the beginning.

Developing is good, as students need to gain practice; without practice you never become a good software developer, engineer or architect. However, there are some points I miss too often at university.

Collaboration

I regularly meet people who are in their 4th, 5th or 6th semester and ask me what a version control system is. Most of them manage their code on only one machine. With the rise of cloud storage like Dropbox or Google Drive, people start synchronizing their code that way (which I highly recommend against). At my university there’s one course where you have to code together with others. However, techniques like code reviews, issue tracking, mailing lists and version control systems such as SVN aren’t mentioned. At the beginning they are hard to learn, but they pay off in the long run.

Presentation

I don’t just mean writing documentation, because that’s only a small part. As far as I have gotten to know university projects, the ones with the most appealing website, nice small tutorials and little sample applications have the highest impact. Computer science departments can be compared with a foundation like Eclipse: they have projects developed in their name and organize employees, external coworkers, code and public relations. Like the Eclipse Foundation, departments should provide an infrastructure for students and PhDs to present their work, including a wiki, issue tracker, code repository, documentation, forum, mailing list and website. If a department is not able to provide such an infrastructure, it should consider renting space, for example at GitHub, which provides most of these features.

Realworld

University is for research and trying things out. A lot of programmers, however, love to do fancy stuff without any purpose, just because “it’s cool and it works”. If you really aim to achieve something with your project, you need some real-world applications that make use of your research results.

Google before you code

There’s almost always someone who has tried to code what you want to do. Search for it! Google (Code), GitHub, Bitbucket, Sourceforge or other code repositories. This is especially helpful for small tasks which aren’t really part of your work. These are some rules of thumb I use to pick projects I want to use in my own projects:

  1. Open source
    Nothing is more frustrating than searching for errors, memory leaks or implementation details and not being able to look inside the source code.
  2. When was the last commit?
    I try to avoid projects which have been inactive for more than 6 months.
  3. Does it implement a specification which is well known or widely used?
    I highly recommend choosing projects which implement a specification (e.g. OSGi, JPA, JAX-RS, ...), because you can easily switch implementation providers.
  4. Is the licence compatible with my project?
    This is necessary, whether you like it or not.

Some companies and organizations which provide a rich set of libraries, mostly for Java, are:

  • Google (Guava, Commons, Protobuf,…)
  • Typesafe (Scala, Akka, Config, …)
  • Apache (Commons, Axis2, Camel, Hadoop,…)
  • Twitter (Bootstrap, scala util,…)
  • Eclipse (EMF, Sapphire, EclipseLink,…)
There are plenty more of course; this is just what first came to my mind. Using libraries makes your code cleaner and easier to maintain, and you can focus on the problems you want to solve.

All this could be summarized as “working more professionally”. Software development is just too complex to lose time with inefficient work. Even though universities are more theoretical, they should motivate students more to form groups and start small software projects on their own to experience real software development.

Deploying Eclipse RCP – The hard way

Today I had the most painful PDE headless build ever. This post just describes what I had to do to deploy an Eclipse RCP application.

The Application and Technology Stack

My application, Medmon, contains about 20 bundles. Most of them are written in plain Java (JDK 7). Some of them are mixed projects with Scala (2.9.1.final). Medmon is a simple CRUD application with an embedded Derby database as backend. It relies heavily on OSGi services and uses this concept wherever possible. As the data persistence layer we use Eclipse Gemini JPA and Eclipse Gemini DBAccess, which provide enterprise persistence for OSGi systems. One plugin uses code generation via the Eclipse annotation processor, because its model is implemented with Eclipse Sapphire. Finally, the Eclipse version we’re using is 3.7.1. To summarize the technology stack:

  • Eclipse Indigo (3.7.1) with Scala IDE Plugin, Sapphire Plugin and Git Plugin
  • Persistence Layer: Eclipse Gemini JPA / DBAccess, EclipseLink 2.3.0
  • Scala 2.9.1.final
  • JDK 7

The product

At the beginning there is the product definition. Defining this for an existing Eclipse application with your bundles organized in features is really simple. We added two VM arguments:

-DREFRESH_BUNDLES=false

Prevents Eclipse Gemini from refreshing bundles, which caused UI extension points to randomly disappear.

-Dderby.system.home=${system_property:user.home}${system_property:file.separator}".derby"

Sets the derby.system.home property, which is used to resolve the embedded database directory if it is not given as an absolute path. The ${…} placeholders are resolved at runtime by Eclipse.
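
As a small illustration of what this property does (not taken from the Medmon code base; the database name medmon-db and the class are made up), Derby resolves a relative database name in the JDBC URL against derby.system.home:

import java.sql.Connection;
import java.sql.DriverManager;

public class DerbyHomeExample {

	public static void main(String[] args) throws Exception {
		// with -Dderby.system.home=/home/alice/.derby the database ends up in /home/alice/.derby/medmon-db
		System.out.println("derby.system.home = " + System.getProperty("derby.system.home"));

		// requires the embedded Derby driver (derby.jar) on the classpath
		Connection con = DriverManager.getConnection("jdbc:derby:medmon-db;create=true");
		con.close();
	}
}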

The medmon.derby plugin has a little specialty in its MANIFEST.MF

Eclipse-BundleShape: dir

which deploys the plugin as a directory instead of a JAR file.

Let’s deploy it!

Okay, I want to test it on my Ubuntu machine. Using the Eclipse Export Product wizard is easy: it generates a repository and an application folder. Below is every single step I had to do to get this product running.

1. Redeploy all plugins with Scala code

The PDE headless build ignores the Scala compiler and just compiles all Java classes. However, it copies all .scala files into your JAR, whether you want it or not. You have to re-export your plugins with the plug-in export wizard and select “Use class files compiled in the workspace”. Now you get class files plus Scala files in your JAR. Not good, but better.

2. One plugin isn’t compiled completely

One plain Java plugin didn’t have any class files in its deployed JAR. I had to copy them manually into the JAR, where I found a bug in Ubuntu’s Archive Manager: if you copy a directory recursively via drag and drop into the JAR file, only the leaf files are copied. So with a package x and a subpackage x.y, only the classes in x.y were copied.

3. Eclipse-BundleShape: dir – only works when deploying a meta repository

If you don’t enable the “generate meta repository” option, this MANIFEST header is ignored.

4. Placeholders in eclipse.ini aren’t resolved in deployed products

The VM argument

-Dderby.system.home=${system_property:user.home}${system_property:file.separator}".derby"

isn’t resolved at runtime in a deployed product. Derby then tries to create the database relative to the execution folder. You have to adjust the eclipse.ini accordingly.

5. Problems not mentioned or not tested yet

Final notes

I know there’s Maven. I know there are build servers. I know there’s Tycho and I know there’s m2scala. Someday I will migrate this project. What annoys me the most is that everything fails so silently. I’m a very young programmer and I’m trying my best to get better. What are your experiences with the Eclipse PDE headless build?

Akka and OSGi development in Eclipse

This short tutorial is about how to run Akka in an OSGi environment. I faced a lot of problems deploying this in plain Eclipse without Maven, Bnd or sbt.

This example is done with the Java API; however, it is also possible with Scala.

Requirements

  • Eclipse Helios 3.6.2 with Scala-Plugin
  • akka-1.1-modules distribution

Configuration

First we have to make some minor changes to a few manifest files in the Akka distribution.

  1. Extract akka-modules-1.1.zip, e.g. to ~/akka
  2. Go to akka/lib_managed/compile
  3. Open akka-actor-1.1.jar -> META-INF/MANIFEST.MF
  4. Delete the following line: Private-Package: *
  5. Do the same with akka-typed-actor-1.1.jar

Second, you have to set up a target platform which is used to run the OSGi environment.

  1. Go to Window -> Preferences -> Plug-in Development -> Target Platform
  2. Add a new target platform, using the default
  3. Extract your akka-modules-1.1.zip, e.g. to ~/akka

You need the following plugins:
  1. guice-all-2.0.jar
  2. logback-classic-0.9.24.jar
  3. logback-core-0.9.24.jar
  4. slf4j-api-1.6.0.jar
  5. Aspectwerkz by Jonas Bonér

The bundle

Create a new plug-in project, with no contributions to the UI and with an activator class.

Copy the following libraries into your bundle and add them to the classpath in your MANIFEST.MF:

  • akka-actor-1.1.jar
  • akka-typed-actor-1.1.jar
  • akka-slf4j-1.1.jar

Create a class MyActor

import akka.actor.UntypedActor;
 
public class MyActor extends UntypedActor {
 
	@Override
	public void onReceive(Object msg) throws Exception {
		System.out.println("Message: " + msg);
	}
 
}

Add these lines to your Activator class.

 

import akka.actor.ActorRef;
import akka.actor.Actors;
 
//...
 
	public void start(BundleContext bundleContext) throws Exception {
		Activator.context = bundleContext;
		ActorRef actor = Actors.actorOf(MyActor.class).start();
		actor.sendOneWay("Hello You");
	}
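
Putting it together, a complete Activator might look roughly like this (just a sketch: the package name matches the Bundle-Activator entry in the MANIFEST.MF shown below and the static context field follows the snippet above, while the stop logic is my own addition):

package de.lmu.ifi.dbs.knowing.core.internal;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

import akka.actor.ActorRef;
import akka.actor.Actors;

public class Activator implements BundleActivator {

	static BundleContext context;

	private ActorRef actor;

	@Override
	public void start(BundleContext bundleContext) throws Exception {
		Activator.context = bundleContext;
		// create and start the actor, then send it a fire-and-forget message
		actor = Actors.actorOf(MyActor.class).start();
		actor.sendOneWay("Hello You");
	}

	@Override
	public void stop(BundleContext bundleContext) throws Exception {
		// stop the actor when the bundle is stopped
		if (actor != null) {
			actor.stop();
		}
		Activator.context = null;
	}
}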

At last you have to edit the MANIFEST.MF. It should look something like this (I know this may not be the smallest possible set of imported Scala packages):

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Core
Bundle-SymbolicName: de.lmu.ifi.dbs.knowing.core;singleton:=true
Bundle-Version: 1.0.0.qualifier
Bundle-Activator: de.lmu.ifi.dbs.knowing.core.internal.Activator
Require-Bundle: org.eclipse.core.runtime,
 se.scalablesolutions.akka.actor;bundle-version="1.0.0",
 se.scalablesolutions.akka.stm;bundle-version="1.0.0",
 se.scalablesolutions.akka.typed.actor;bundle-version="1.0.0"
Bundle-ActivationPolicy: lazy
Bundle-RequiredExecutionEnvironment: JavaSE-1.6
Bundle-ClassPath: .
Import-Package: scala;version="2.9.0.1",
 scala.collection;version="2.9.0.1",
 scala.collection.generic;version="2.9.0.1",
 scala.collection.immutable;version="2.8.1.final",
 scala.collection.interfaces;version="2.8.1.final",
 scala.collection.mutable;version="2.8.1.final",
 scala.compat;version="2.8.1.final",
 scala.concurrent;version="2.8.1.final",
 scala.concurrent.forkjoin;version="2.8.1.final",
 scala.io;version="2.8.1.final",
 scala.math;version="2.8.1.final",
 scala.mobile;version="2.8.1.final",
 scala.ref;version="2.8.1.final",
 scala.reflect;version="2.8.1.final",
 scala.reflect.generic;version="2.8.1.final",
 scala.runtime;version="2.8.1.final",
 scala.text;version="2.8.1.final",
 scala.util;version="2.8.1.final",
 scala.util.automata;version="2.8.1.final",
 scala.util.continuations;version="2.8.1.final",
 scala.util.control;version="2.8.1.final",
 scala.util.grammar;version="2.8.1.final",
 scala.util.matching;version="2.8.1.final",
 scala.util.parsing.ast;version="2.8.1.final",
 scala.util.parsing.combinator;version="2.8.1.final",
 scala.util.parsing.combinator.lexical;version="2.8.1.final",
 scala.util.parsing.combinator.syntactical;version="2.8.1.final",
 scala.util.parsing.combinator.testing;version="2.8.1.final",
 scala.util.parsing.combinator.token;version="2.8.1.final",
 scala.util.parsing.input;version="2.8.1.final",
 scala.util.parsing.json;version="2.8.1.final",
 scala.util.parsing.syntax;version="2.8.1.final",
 scala.util.regexp;version="2.8.1.final"

Now let’s run this!

Launch configuration

  1. Open Run -> Run Configurations
  2. Create a new OSGi Launch configuration
  3. Add the following bundles
    1. org.scala-ide.scala.library (2.8.1) (the akka scala library didn’t work for me)
    2. se.scalablesolutions.akka.actor
    3. se.scalablesolutions.akka.osgi.dependencies.bundle
    4. se.scalablesolutions.akka.actor.typed.actor
    5. se.scalablesolutions.akka.actor.stm
    6. com.google.inject
    7. Equinox runtime components (e.g. org.eclipse.core.runtime, ...)
  4. Try to launch

 

Hope this works for you, too!

Eclipse Gemini JPA Tutorial

After my test I will start writing a tutorial with a sample application for the Eclipse Gemini Project.

Currently you can check out the SVN repository at:
https://svn.cip.ifi.lmu.de/~seilern/svn/org.eclipse.gemini.jpa

Good luck,
Muki

UI Extension via Extension Points in Eclipse RCP

Eclipse has a powerful mechanism that allows plugins to contribute to the UI: extensions and extension points. There are a lot of excellent tutorials on the internet, like Eclipse Extensions by Lars Vogel. However, this little tutorial is about how to contribute to an editor (in this case an additional TabItem).

1. The Extension Interface

First we have to create an interface which the extension has to implement. To create an additional tab in an editor, I created an interface like this:

public interface IEditorTabExtension {
 
	/**
	 * Is called to create the tab control
	 * @param parent
	 * @return Control - The created Control
	 */
	public Control createContents(Composite parent);
 
	/**
	 * Should be called by the doSave method in
	 * the root EditorPart
	 *
	 * @param monitor
	 */
	public void doSave(IProgressMonitor monitor);
 
	/**
	 * Call-by-Reference dirty boolean. Indicates
	 * if changes were made.
	 *
	 * @param dirty
	 */
	public void setDirty(Boolean dirty);
 
	/**
	 *
	 * @return Name for the Tab
	 */
	public String getName();
}

2. Create the Extension Point

First we create an Extension Point in the plugin.xml via the plugin.xml Editor.

Create Extension Point

The extension point schema editor should now open automatically (otherwise there’s a button for it). Add a new element and call it “tab”. Now add a new attribute and name it “class”; its type should be “java” and it should implement
our IEditorTabExtension. Don’t forget to create a new “Choice” in the “extension” element and, inside it, a
“tab” entry. Now it should look like this:

Extension Point Elements

3. Create an Extension and provide it

Our plugin can not only provide an extension point, it can provide an extension, too. Feel free to
implement the interface with any UI you like (a minimal example follows below). To register this extension, open the plugin.xml
and go to the Extensions tab. Add our new extension point de.mukis.editor.EditorTabExtension.
It should look like this:

Provide Extension
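
For illustration, a very small extension could look like this (only a sketch: the class name, package and UI are made up, and the import of IEditorTabExtension depends on where you placed the interface):

package de.mukis.editor.example;

import org.eclipse.core.runtime.IProgressMonitor;
import org.eclipse.swt.SWT;
import org.eclipse.swt.layout.FillLayout;
import org.eclipse.swt.widgets.Composite;
import org.eclipse.swt.widgets.Control;
import org.eclipse.swt.widgets.Text;

public class NotesTabExtension implements IEditorTabExtension {

	private Text notes;
	private Boolean dirty = Boolean.FALSE;

	@Override
	public Control createContents(Composite parent) {
		// a simple container with a single multi-line text field
		Composite container = new Composite(parent, SWT.NONE);
		container.setLayout(new FillLayout());
		notes = new Text(container, SWT.MULTI | SWT.BORDER | SWT.WRAP);
		return container;
	}

	@Override
	public void doSave(IProgressMonitor monitor) {
		// persist the tab content here; this sketch only resets the dirty flag
		dirty = Boolean.FALSE;
	}

	@Override
	public void setDirty(Boolean dirty) {
		this.dirty = dirty;
	}

	@Override
	public String getName() {
		return "Notes";
	}
}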

4. Evaluate the contributions and add them to the editor

private IEditorTabExtension[] extensions;

	@Override
	public void doSave(IProgressMonitor monitor) {
		dirty = false;
		for (IEditorTabExtension e : extensions)
			e.doSave(monitor);
		firePropertyChange(IWorkbenchPartConstants.PROP_DIRTY);
	}

	@Override
	public void createPartControl(Composite parent) {
		folder = new TabFolder(parent, SWT.BORDER);

		extensions = evaluateTabContribs();
		for (IEditorTabExtension e : extensions) {
			TabItem tab = new TabItem(folder, SWT.BORDER);
			tab.setText(e.getName());
			tab.setControl(e.createContents(folder));
			System.out.println("Tab added");
		}
	}

	// TAB_ID is the full id of the extension point, e.g. de.mukis.editor.EditorTabExtension
	private IEditorTabExtension[] evaluateTabContribs() {
		IConfigurationElement[] config = Platform.getExtensionRegistry()
				.getConfigurationElementsFor(TAB_ID);
		final List<IEditorTabExtension> list = new LinkedList<IEditorTabExtension>();
		try {
			for (IConfigurationElement e : config) {
				System.out.println("Evaluating extension");
				final Object o = e.createExecutableExtension("class");
				if (o instanceof IEditorTabExtension) {
					ISafeRunnable runnable = new ISafeRunnable() {

						@Override
						public void handleException(Throwable exception) {
							System.out.println("Exception in tab");
						}

						@Override
						public void run() throws Exception {
							IEditorTabExtension tab = (IEditorTabExtension) o;
							list.add(tab);
							System.out.println("Extension detected: " + tab.getName());
						}
					};
					SafeRunner.run(runnable);
				}
			}
		} catch (CoreException ex) {
			System.out.println(ex.getMessage());
		}
		return list.toArray(new IEditorTabExtension[list.size()]);
	}

This is very basic. The isDirty flag solution isn’t very smart. We use the Call-by-Reference effect
to provide a “global” Boolean.

Thanks to Lars Vogel’s tutorials, which inspired me to do my own stuff and have been used for this tutorial.