Archive by category: Tutorials

How to add a maven-plugin jar as dependency to sbt

I want to integrate the jdeb library into one of my own libraries.
Since it is packaged as a maven-plugin, SBT does not resolve the jar
if you just add it as a plain dependency. This will do the trick:

"org.vafer" % "jdeb" % "1.2" artifacts (Artifact("jdeb", "jar", "jar"))

This one is for sbt 0.13.5!
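
In context, the full build.sbt entry might look like this (a minimal sketch; the rest of the build definition is omitted):

// build.sbt (sbt 0.13.x) – resolve the jdeb maven-plugin artifact as a plain jar dependency
libraryDependencies += "org.vafer" % "jdeb" % "1.2" artifacts (Artifact("jdeb", "jar", "jar"))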

Playframework and RequireJS

As a backend developer I like tools that help me structure my code. Doing more and more frontend work, I finally found time to learn some of the basics of RequireJS. Unfortunately, there aren't many tutorials on how to combine the Play Framework and RequireJS in a multipage application. There are some for AngularJS, but I didn't want to port my applications to two new systems at once.

Application structure

For a sample application, I implemented two pages: an index page and a dashboard page.

Both have their own data entry point and dependencies. The index page looks like this:

@(message: String)
 
@main("RequireJS with Play") {
    // html here
 
 @helper.requireJs(core = routes.Assets.at("javascripts/require.js").url,
                   module = routes.Assets.at("javascripts/main/main").url)
 
}
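
The dashboard page follows the same pattern and only points to its own entry module (a sketch mirroring the template above; the actual markup is an assumption):

@main("Dashboard") {
    // html here
 
 @helper.requireJs(core = routes.Assets.at("javascripts/require.js").url,
                   module = routes.Assets.at("javascripts/dashboard/main").url)
 
}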

The routes file is very basic, too:

GET    /                controllers.Application.index
GET    /dashboard       controllers.Application.dashboard
POST   /api/sample      controllers.Application.sample
 
### Additions needed
GET    /jsroutes.js     controllers.Application.jsRoutes()
### Enable www.WebJars.org based resources to be returned
GET    /webjars/*file   controllers.WebJarAssets.at(file)
GET    /assets/*file    controllers.Assets.at(path="/public", file)

The JavaScript folder layout:

  • assets/javascripts
    • common.js
    • main.js
    • dashboard
      • chart.js
      • main.js
    • lib
      • math.js

How does it work?

First you define a file common.js, which is used to configure requirejs.

(function(requirejs) {
    "use strict";
 
    requirejs.config({
        baseUrl : "/assets/javascripts",
        shim : {
            "jquery" : {
                exports : "$"
            },
            "jsRoutes" : {
                exports : "jsRoutes"
            }
        },
        paths : {
            "math" : "lib/math",
            // Map the dependencies to CDNs or WebJars directly
            "_" : "//cdnjs.cloudflare.com/ajax/libs/underscore.js/1.5.1/underscore-min",
            "jquery" : "//localhost:9000/webjars/jquery/2.0.3/jquery.min",
            "bootstrap" : "//netdna.bootstrapcdn.com/bootstrap/3.0.0/js/bootstrap.min",
            "jsRoutes" : "//localhost:9000/jsroutes"
        // A WebJars URL would look like
        // //server:port/webjars/angularjs/1.0.7/angular.min
        }
    });
 
    requirejs.onError = function(err) {
        console.log(err);
    };
})(requirejs);

The baseUrl is important, as this will be the root path from now on. IMHO this makes things easier than relative paths.

The shim configuration is used to export your jsRoutes, which is defined in my Application.scala file. Of course you can add as many as you want.
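
For reference, the jsRoutes action in Application.scala could look roughly like this (a sketch against the Play 2.2-era API; which reverse routes you expose is up to you, here only the sample endpoint is assumed):

// inside the existing Application controller (Application.scala)
import play.api.Routes
import play.api.mvc._
 
def jsRoutes = Action { implicit request =>
  Ok(Routes.javascriptRouter("jsRoutes")(
    // assumption: the frontend only needs the sample endpoint
    routes.javascript.Application.sample
  )).as("text/javascript")
}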

The paths section is a bit tricky. Currently it seems there’s no better way than hardcoding the URLs, like “jsRoutes” : “//localhost:9000/jsroutes”, when you use WebJars.

Define and Require

Ordering is crucial! For my /dashboard page, dashboard/main.js is the entry point:

// first load the configuration
require(["../common"], function(common) {
   console.log('Dashboard started');
 
   // Then load submodules. Remember the baseUrl is set:
   // Even if you are in the dashboard folder you have to reference dashboard/chart
   // directly
   require(["jquery", "math", "dashboard/chart"], function($, math, chart){
       console.log("Title is : " + $('h1').text());
       console.log("1 + 3 = " + math.sum(1,3));
       console.log(chart);
 
       chart.load({ page : 'dashboard'}, function(data){
           console.log(data);
       }, function(status, xhr, error) {
           console.log(status);
       });
 
   });
});

For chart.js:

// first the configuration, then other dependencies
define([ "../common", "jsRoutes" ], {
    load : function(data, onSuccess, onFail) {
        var r = jsRoutes.controllers.Application.sample();
        r.contentType = 'application/json';
        r.data = JSON.stringify(data);
        $.ajax(r).done(onSuccess).fail(onFail);
    }
})
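
The sample action itself isn't shown in this post. A minimal sketch of a JSON endpoint that would satisfy this call could look like the following (purely illustrative; the real implementation may differ):

// inside the Application controller – hypothetical JSON echo endpoint for dashboard/chart.js
import play.api.libs.json.Json
import play.api.mvc._
 
def sample = Action(parse.json) { request =>
  // echo the posted payload together with a status field
  Ok(Json.obj("status" -> "ok", "request" -> request.body))
}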

Gradient Descent with Scala

Currently I’m watching a Scala and a Machine Learning course on coursera.org and
wanted to try some simple stuff for myself. I figured Gradient Descent would be a
perfect start for trying some functional programming.
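
For reference, the update rule the code below implements is the standard batch gradient descent step for linear regression, with parameters θ (tetas in the code), learning rate α (alpha) and m training samples (samples.size):

\theta_j := \theta_j - \frac{\alpha}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)},
\qquad h_\theta(x) = \theta_0 + \theta_1 x_1 + \dots + \theta_n x_n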

The code

import scala.math._
 
object GradientDecent extends App {
 
  val alpha = 0.1 // size of steps taken in gradient descent
  val samples = List((Vector(0.0, 0.0), 2.0), (Vector(3.0, 1.0), 12.0), (Vector(2.0, 2.0), 18.0))
 
  var tetas = Vector(0.0, 0.0, 0.0)
  for (i <- 0 until 1000) { // iteration count assumed – the original value was lost
    tetas = tetas.zipWithIndex map {
      case (teta, 0) =>
        teta - (alpha / samples.size) * samples.foldLeft(0.0) {
          case (sum, (x, y)) => decentTerm(sum, 1, x, y, tetas)
        }
      case (teta, i) =>
        teta - (alpha / samples.size) * samples.foldLeft(0.0) {
          case (sum, (x, y)) => decentTerm(sum, x(i - 1), x, y, tetas)
        }
    }
  }
 
  def decentTerm(sum: Double, x_j: Double, x: Vector[Double], y: Double, tetas: Vector[Double]) = {
    sum + x_j * (h(x, tetas) - y)
  }
 
  def h(x: Vector[Double], teta: Vector[Double]): Double = {
    teta(0) + {
      for (i <- 1 until teta.size) yield teta(i) * x(i - 1)
    }.foldLeft(0.0)((sum, x) => sum + x)
  }
 
}
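
To check the result, you could append a print statement at the end of the App body (a hypothetical addition; for the three samples above the exact solution is θ = (2, 1, 7)):

  // hypothetical check: after the loop, tetas should approach Vector(2.0, 1.0, 7.0)
  println("tetas = " + tetas)
  println("h(3, 1) = " + h(Vector(3.0, 1.0), tetas)) // should be close to the sample value 12.0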

And that’s pretty much everything. This is just a first version and I’m sure somebody will find ways
to optimize it. However, even this hacked version is short and handsome :)

Update
The code snippet here is a gradient descent for performing linear regression.

DevVM Part 1 – Gerrit on Ubuntu 12.04 Server

I’m currently working on a little development VM and want to share some of the insights I gained and how I managed to get things to work. The series starts with a tutorial on installing Gerrit.

What is Gerrit?

Gerrit provides a powerful server to integrate a code-review process into your git-driven development workflow. These are the main reasons I picked Gerrit:

  • Supports git as the versioning system – awesome
  • Integration with build servers like Jenkins to run tests automatically, making the CI server part of the code review process
  • Great Eclipse integration with EGit

Install Gerrit

All you need is root shell access to your server and a working internet connection (surprise!)

Generate gerrit2 user

First we create a group gerrit2 and a user gerrit2 with a home directory located at /usr/local/gerrit2:

sudo addgroup gerrit2
sudo adduser --system --home /usr/local/gerrit2 --shell /bin/bash --ingroup gerrit2 gerrit2

I use my own MySQL database instead of the integrated H2 database. You also have to create a database user gerrit2 and a database called reviewdb. On the shell you can do this via:

mysql --user=root -p
CREATE USER 'gerrit2'@'localhost' IDENTIFIED BY 'secret';
CREATE DATABASE reviewdb;
ALTER DATABASE reviewdb charset=latin1;
GRANT ALL ON reviewdb.* TO 'gerrit2'@'localhost';
FLUSH PRIVILEGES;
exit;

The last thing to do as root is to create a default config file for Gerrit:

sudo touch /etc/default/gerritcodereview

and insert, with an editor of your choice:

GERRIT_SITE=/usr/local/gerrit2

Now we log in as the gerrit2 user and install Gerrit.

sudo su gerrit2
cd ~
wget http://gerrit.googlecode.com/files/gerrit-2.4.2.war
java -jar gerrit-2.4.2.war init -d /usr/local/gerrit2

The download URL may have changed, so check it first.

Fill out everything according to your needs. The database password is the secret you chose above. Check that everything works by starting and stopping Gerrit:

cd ~/bin
./gerrit.sh start
./gerrit.sh stop

If everything works fine, you can update your init.d setup so that Gerrit starts automatically at boot. You do this with the following commands:

sudo ln -snf /usr/local/gerrit2/bin/gerrit.sh /etc/init.d/gerrit
sudo update-rc.d gerrit defaults

Now your Gerrit server starts each time your machine boots.

Troubleshooting

I made some errors during the installation which almost drove me crazy.

Authentication via OpenID – Register new Email

It’s great that you can access the Gerrit server with OpenID. However, if the email address on your OpenID account (like *@gmail.com) differs from the one on your ssh key (like *@your-company.com), then you must register a new email address on your account. That only works if your SMTP server is correctly configured.

By default Gerrit uses “user@hostname” as the sender. For me that was “gerrit@server”, which isn’t a valid email address. You can configure this in the [user] section of the Gerrit config:

[user]
      name = Your name
      email = name@your-company.com

Maven – Tycho, Java, Scala and APT

This tutorial shows a small project which is built with Maven Tycho and has the following requirements:

  • Mixed Java / Scala project
  • Eclipse plugin deployment
  • Eclipse Annotation Processing (APT)
  • Manifest-first approach
  • Java 7 / Scala 2.9.2

That doesn’t sound too hard. In fact it isn’t, if you are familiar with Maven and how Tycho works.

Setting up maven

First download Maven 3 and configure it.
I created two profiles in my settings.xml and added some repositories.
My two profiles are tycho-build and scala-build, which are activated when
the corresponding property is present.

<settings>
 <profiles>
  <profile>
   <id>tycho</id>
   <activation>
    <activeByDefault>false</activeByDefault>
    <property>
     <name>tycho-build</name>
    </property>
  </activation>
  <repositories>
   <repository>
    <id>eclipse-indigo</id>
    <layout>p2</layout>
    <url>http://download.eclipse.org/releases/indigo</url>
   </repository>
   <repository>
    <id>eclipse-sapphire</id>
    <layout>p2</layout>
    <url>http://download.eclipse.org/sapphire/0.4.1/repository</url>
   </repository>
   <repository>
    <id>eclipse-scala-ide</id>
    <layout>p2</layout>
   <url>http://download.scala-ide.org/releases-29/milestone/site</url>
  </repository>
  <repository>
   <id>eclipse-gemini-dbaccess</id>
   <layout>p2</layout>
   <url>http://download.eclipse.org/gemini/dbaccess/updates/1.0</url>
   </repository>
  </repositories>
 </profile>
 
 <profile>
  <id>scala</id>
  <activation>
   <activeByDefault>false</activeByDefault>
    <property>
     <name>scala-build</name>
    </property>
   </activation>
  <repositories>
   <repository>
    <id>scala-tools.org</id>
    <name>Scala-tools Maven2 Repository</name>
    <url>http://scala-tools.org/repo-releases</url>
   </repository>
   <repository>
    <id>typesafe</id>
    <name>Typesafe Repository</name>
    <url>http://repo.typesafe.com/typesafe/releases/</url>
   </repository>
  </repositories>
 <pluginRepositories>
  <pluginRepository>
    <id>scala-tools.org</id>
    <name>Scala-tools Maven2 Repository</name>
    <url>http://scala-tools.org/repo-releases</url>
   </pluginRepository>
  </pluginRepositories>
 </profile>
</profiles>
</settings>

Setting up the project – The tycho build

For my project I just used two simple plugins. Nothing fancy here.
  1. Create plugin-project
  2. Add some dependencies
  3. Write some classes in Java

I recommend the following project structure:

root-project/
 plugin.core
 plugin.ui
 plugin.xy

Go to your root-project folder in your favorite console and use the following command to generate the pom.xml files with Tycho:

mvn org.sonatype.tycho:maven-tycho-plugin:generate-poms -DgroupId=de.mukis -Dtycho.targetPlatform=path/to/target/platform/

This generates a first project for you. A few things to tweak, which most of the other tutorials treat as best practice:
  • Replace all concrete version numbers with property placeholders, e.g 0.12.0 with ${tycho.version}
  • Remove all groupId and version tags in the pom.xml. The parent pom.xml will generate these.
  • Check your folder structure. Tycho infers AND changes your source directory according to your build.properties.
Next add the p2 repositories needed to resolve all dependencies. This is done via the <repository> tag; the full pom.xml is at the end.
Sometimes you have existing OSGi bundles but no p2 repository to get them from. Eclipse PDE has a nice extra feature for you: the Features and Bundles Publisher application. Note: it’s very important that your repository folder contains the two folders plugins and features.

Now you can run your Maven build with

mvn clean package

and you will get a nicely packaged OSGi bundle.

Setting up the project – The scala build

So now we want to add some Scala classes. Create a new source folder src/main/scala and create some classes. Don’t forget to import the Scala packages, so that your MANIFEST.MF contains something like:
Import-Package: org.osgi.framework;version="1.6.0",
 scala;version="[2.9.0.1,2.9.3.0]",
 scala.collection;version="[2.9.0.1,2.9.3.0]",
 scala.collection.generic;version="[2.9.0.1,2.9.3.0]",
 scala.collection.immutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.interfaces;version="[2.9.0.1,2.9.3.0]",
 scala.collection.mutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel.immutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel.mutable;version="[2.9.0.1,2.9.3.0]",
 scala.concurrent;version="[2.9.0.1,2.9.3.0]",
 scala.concurrent.forkjoin;version="[2.9.0.1,2.9.3.0]",
 scala.io;version="[2.9.0.1,2.9.3.0]",
 scala.math;version="[2.9.0.1,2.9.3.0]",
 scala.parallel;version="[2.9.0.1,2.9.3.0]",
 scala.ref;version="[2.9.0.1,2.9.3.0]",
 scala.reflect,
 scala.reflect.generic;version="[2.9.0.1,2.9.3.0]",
 scala.runtime;version="[2.9.0.1,2.9.3.0]",
 scala.text;version="[2.9.0.1,2.9.3.0]",
 scala.util;version="[2.9.0.1,2.9.3.0]",
Now there are two alternatives for the build. I chose to add the source folder to my build.properties and exclude the .scala files in my Maven POM. The alternative is described here.
We need the maven-scala-plugin. Add the repository and plugin repository:
...
 <repository>
  <id>scala-tools.org</id>
  <name>Scala-tools Maven2 Repository</name>
  <url>http://scala-tools.org/repo-releases</url>
 </repository>
...
 <pluginRepository>
  <id>scala-tools.org</id>
  <name>Scala-tools Maven2 Repository</name>
  <url>http://scala-tools.org/repo-releases</url>
 </pluginRepository>
and to our root pom.xml we add the maven-scala-plugin
<plugin>
 <groupId>org.scala-tools</groupId>
 <artifactId>maven-scala-plugin</artifactId>
 <version>2.15.0</version>
 <executions>
  <execution>
   <id>compile</id>
   <goals>
    <goal>compile</goal>
   </goals>
   <phase>compile</phase>
  </execution>
 
  <execution>
   <id>test-compile</id>
   <goals>
    <goal>testCompile</goal>
   </goals>
   <phase>test-compile</phase>
  </execution>
 
  <execution>
   <phase>process-resources</phase>
   <goals>
    <goal>compile</goal>
   </goals>
  </execution>
 </executions>
</plugin>
There is actually an easier variant, but it doesn’t work with circular dependencies.
If you have added the src/main/scala folder to your build.properties, then you have to add another plugin to prevent Tycho from exporting all Scala source files.
<plugin>
 <groupId>org.eclipse.tycho</groupId>
 <artifactId>tycho-compiler-plugin</artifactId>
 <version>${tycho.version}</version>
 <configuration>
  <excludeResources>
   <excludeResource>**/*.scala</excludeResource>
  </excludeResources>
 </configuration>
</plugin>
Now the build should work with Scala, too.

Setting up the project – APT code generation with Eclipse Sapphire

I’m creating some models with Eclipse Sapphire, which uses Java Annotation Processing (APT) to generate the models. The apt-maven-plugin is a Maven plugin that allows us to trigger a processing factory during the build. The current version alpha-04 has a bug which leads to an error with Java 7. So before we can use this plugin, you have to check out the source code, build the latest alpha-05 version (as it’s not released at the moment) and install it in your local Maven repository.
Now you can add the apt-maven-plugin to the plugin which needs APT. This could look like:
<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<modelVersion>4.0.0</modelVersion>
 
<parent>
 <groupId>de.lmu.ifi.dbs.knowing</groupId>
 <artifactId>Knowing</artifactId>
 <version>0.1.4-SNAPSHOT</version>
</parent>
 
<artifactId>de.lmu.ifi.dbs.knowing.core</artifactId>
<packaging>eclipse-plugin</packaging>
 
<build>
 <plugins>
  <plugin>
   <groupId>org.codehaus.mojo</groupId>
   <artifactId>apt-maven-plugin</artifactId>
   <version>1.0-alpha-5-SNAPSHOT</version>
   <executions>
    <execution>
     <goals>
      <goal>process</goal>
     </goals>
    </execution>
   </executions>
   <configuration>
  <factory>org.eclipse.sapphire.sdk.build.processor.internal.APFactory</factory>
   </configuration>
  </plugin>
 </plugins>
</build>
</project>
Finally, you have to add the factory bundles as optional dependencies to the MANIFEST.MF of the plugin using APT.
org.eclipse.sapphire.sdk;bundle-version="[0.4.0,0.5.0)";resolution:=optional,
org.eclipse.sapphire.sdk.build.processor;bundle-version="[0.4.0,0.5.0)";resolution:=optional
If you trigger the build, you will see that your APT sources are generated in target/generated-sources/apt. However, the files are not compiled. At first I tried the build-helper-maven-plugin, but Tycho seems to override these settings. So I added target/generated-sources/apt to the build.properties of the plugin using APT, which seems like a bad workaround to me. However, it works fine.

Source Code

You can find the code in my github repository.

Conclusion

For a beginner it was not that easy to avoid all the little traps with Tycho, Scala, Maven and APT. But in the end I hope to save a lot of time when building and testing.

Things to add

The tutorial doesn’t include any testing.

Links

https://github.com/muuki88/tycho
http://wiki.eclipse.org/Tycho/Reference_Card
http://mattiasholmqvist.se/2010/02/building-with-tycho-part-1-osgi-bundles/
https://github.com/misto/Scala-Hello-World-Plug-in
Compiling circular dependent java-scala classes
Eclipse sapphire and tycho
compile generated sources
http://mojo.codehaus.org/apt-maven-plugin/
APT M2E Connector
Publish pre-compiled bundles in p2 repository

Akka and OSGi development in Eclipse

This short tutorial is about how to run Akka in an OSGi environment. I faced
a lot of problems deploying this in plain Eclipse without Maven, Bnd or sbt.

This example is done with the Java API; however, it is also possible with Scala (a sketch follows the actor class below).

Requirements

  • Eclipse Helios 3.6.2 with Scala-Plugin
  • akka-1.1-modules distribution

Configuration

First we have to make some minor changes to some manifest files in the Akka distribution.

  1. Extract akka-modules-1.1.zip, e.g ~/akka
  2. go to akka/lib_managed/compile
  3. open akka-actor-1.1.jar -> META-INF/MANIFEST.MF
  4. delete the following line: Private-Package: *
  5. Do the same with akka-typed-actor-1.1.jar

Second, you have to set up a target platform which is used to run the OSGi environment.

  1. Go to Window->Preferences->Plug-in Development->Target Platform
  2. Add a target platform, use the default
  3. Extract your akka-modules-1.1.zip, e.g. ~/akka

You need the following plugins:
  1. guice-all-2.0.jar
  2. logback-classic-0.9.24.jar
  3. logback-core-0.9.24.jar
  4. slf4j-api-1.6.0.jar
  5. Aspectwerkz by Jonas Bonér

The bundle

Create a new plug-in project, with no contributions to the UI and with an activator class.

Copy the following libs into your bundle and add them to your classpath in MANIFEST.MF

  • akka-actor-1.1.jar
  • akka-typed-actor-1.1.jar
  • akka-slf4j-1.1.jar

Create a class MyActor

import akka.actor.UntypedActor;
 
public class MyActor extends UntypedActor {
 
	@Override
	public void onReceive(Object msg) throws Exception {
		System.out.println("Message: " + msg);
	}
 
}
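
As mentioned above, the same actor is also possible with the Scala API. A rough sketch against the Akka 1.1 Scala API (not part of this setup; the class name is made up):

import akka.actor.Actor
import akka.actor.Actor.actorOf
 
// Scala counterpart of the Java MyActor above (Akka 1.1 API)
class MyScalaActor extends Actor {
  def receive = {
    case msg => println("Message: " + msg)
  }
}
 
// started the same way from the Activator:
//   val actor = actorOf[MyScalaActor].start()
//   actor ! "Hello You"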

Add these lines to your Activator class.

 

import akka.actor.ActorRef;
import akka.actor.Actors;
 
//...
 
	public void start(BundleContext bundleContext) throws Exception {
		Activator.context = bundleContext;
		ActorRef actor = Actors.actorOf(MyActor.class).start();
		actor.sendOneWay("Hello You");
	}

At last you have to edit the MANIFEST.MF. It should look something like this. (I know
I could probably reduce this to the smallest set of imported Scala packages.)

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Core
Bundle-SymbolicName: de.lmu.ifi.dbs.knowing.core;singleton:=true
Bundle-Version: 1.0.0.qualifier
Bundle-Activator: de.lmu.ifi.dbs.knowing.core.internal.Activator
Require-Bundle: org.eclipse.core.runtime,
 se.scalablesolutions.akka.actor;bundle-version="1.0.0",
 se.scalablesolutions.akka.stm;bundle-version="1.0.0",
 se.scalablesolutions.akka.typed.actor;bundle-version="1.0.0"
Bundle-ActivationPolicy: lazy
Bundle-RequiredExecutionEnvironment: JavaSE-1.6
Bundle-ClassPath: .
Import-Package: scala;version="2.9.0.1",
 scala.collection;version="2.9.0.1",
 scala.collection.generic;version="2.9.0.1",
 scala.collection.immutable;version="2.8.1.final",
 scala.collection.interfaces;version="2.8.1.final",
 scala.collection.mutable;version="2.8.1.final",
 scala.compat;version="2.8.1.final",
 scala.concurrent;version="2.8.1.final",
 scala.concurrent.forkjoin;version="2.8.1.final",
 scala.io;version="2.8.1.final",
 scala.math;version="2.8.1.final",
 scala.mobile;version="2.8.1.final",
 scala.ref;version="2.8.1.final",
 scala.reflect;version="2.8.1.final",
 scala.reflect.generic;version="2.8.1.final",
 scala.runtime;version="2.8.1.final",
 scala.text;version="2.8.1.final",
 scala.util;version="2.8.1.final",
 scala.util.automata;version="2.8.1.final",
 scala.util.continuations;version="2.8.1.final",
 scala.util.control;version="2.8.1.final",
 scala.util.grammar;version="2.8.1.final",
 scala.util.matching;version="2.8.1.final",
 scala.util.parsing.ast;version="2.8.1.final",
 scala.util.parsing.combinator;version="2.8.1.final",
 scala.util.parsing.combinator.lexical;version="2.8.1.final",
 scala.util.parsing.combinator.syntactical;version="2.8.1.final",
 scala.util.parsing.combinator.testing;version="2.8.1.final",
 scala.util.parsing.combinator.token;version="2.8.1.final",
 scala.util.parsing.input;version="2.8.1.final",
 scala.util.parsing.json;version="2.8.1.final",
 scala.util.parsing.syntax;version="2.8.1.final",
 scala.util.regexp;version="2.8.1.final"

Now let’s run this!

Launch configuration

  1. Open Run->Launch Configurations.
  2. Create a new OSGi Launch configuration
  3. Add the following bundles
    1. org.scala-ide.scala.library (2.8.1) (the akka scala library didn’t work for me)
    2. se.scalablesolutions.akka.actor
    3. se.scalablesolutions.akka.osgi.dependencies.bundle
    4. se.scalablesolutions.akka.actor.typed.actor
    5. se.scalablesolutions.akka.actor.stm
    6. com.google.inject
    7. Equinox runtime components (e.g. org.eclipse.core.runtime, ...)
  4. Try to launch

 

Hope this works for you, too!

Eclipse Gemini JPA Tutorial

After my test I will start writing a tutorial with a sample application for the Eclipse Gemini Project.

Currently you can check out the SVN repository at:
https://svn.cip.ifi.lmu.de/~seilern/svn/org.eclipse.gemini.jpa

Good luck,
Muki

UI Extension via Extension Points in Eclipse RCP

Eclipse has a powerful mechanism that allows plug-ins to contribute to the UI: Extensions and Extension Points. There are a lot of excellent tutorials on the internet, like Eclipse Extensions by Lars Vogel. However, this little tutorial is about how to contribute to an Editor (in this case an additional TabItem).

1. The Extension Interface

First we have to create an Interface which the Extension has to implement. To create an additional Tab in an Editor I created an Interface like this:

public interface IEditorTabExtension {
 
	/**
	 * Is called to create the tab control
	 * @param parent
	 * @return Control - The created Control
	 */
	public Control createContents(Composite parent);
 
	/**
	 * Should be called by the doSave method in
	 * the root EditorPart
	 *
	 * @param monitor
	 */
	public void doSave(IProgressMonitor monitor);
 
	/**
	 * Call-by-Reference dirty boolean. Indicates
	 * if changes were made.
	 *
	 * @param dirty
	 */
	public void setDirty(Boolean dirty);
 
	/**
	 *
	 * @return Name for the Tab
	 */
	public String getName();
}

2. Create the Extension Point

First we create an Extension Point in the plugin.xml via the plugin.xml Editor.

Create Extension Point

The Extension Schema Editor should now open automatically; otherwise there’s a button for it.
Add a new element and call it “tab”. Now add a new attribute and name it “class”; its type should be “java” and it should
implement our IEditorTabExtension. Don’t forget to add a new Choice to the “extension” element and, inside it, a
“tab” entry. Now it should look like this:

Extension Point Elements

3. Create an Extension and provide it

Our plug-in does not only provide an extension point, it provides an extension too. Feel free to
implement the interface with a UI you like. To register this extension, open the plugin.xml
and the Extensions tab and add our new extension point de.mukis.editor.EditorTabExtension.
It should look like this:

Provide Extension

4. Evaluate contributions and add them to the Editor

private IEditorTabExtension[] extensions;
 
	@Override
	public void doSave(IProgressMonitor monitor) {
		dirty = false;
		for(IEditorTabExtension e : extensions)
			e.doSave(monitor);
		firePropertyChange(IWorkbenchPartConstants.PROP_DIRTY);
	}
 
	@Override
	public void createPartControl(Composite parent) {
		folder = new TabFolder(parent, SWT.BORDER);
 
		extensions = evaluateTabContribs();
		for (IEditorTabExtension e : extensions) {
			TabItem tab = new TabItem(folder, SWT.BORDER);
			tab.setText(e.getName());
			tab.setControl(e.createContents(folder));
			System.out.println("Tab added");
		}
 
	}
 
 private IEditorTabExtension[] evaluateTabContribs() {
		IConfigurationElement[] config = Platform.getExtensionRegistry()
				.getConfigurationElementsFor(TAB_ID);
		final LinkedList<IEditorTabExtension> list = new LinkedList<IEditorTabExtension>();
		try {
			for(IConfigurationElement e : config) {
				System.out.println("Evaluation extension");
				final Object o = e.createExecutableExtension("class");
				if(o instanceof IEditorTabExtension) {
					ISafeRunnable runnable = new ISafeRunnable() {
 
						@Override
						public void handleException(Throwable exception) {
							System.out.println("Exception in Tab");
						}
 
						@Override
						public void run() throws Exception {
							IEditorTabExtension tab = (IEditorTabExtension)o;
							list.add(tab);
							System.out.println("Extension detected: " + tab.getName());
						}
					};
					SafeRunner.run(runnable);
				}
			}
		} catch(CoreException ex) {
			System.out.println(ex.getMessage());
		}
		return list.toArray(new IEditorTabExtension[list.size()]);
	}

This is very basic. The isDirty flag solution isn’t very smart. We use the Call-by-Reference effect
to provide a “global” Boolean.

Thanks to Lars Vogel’s tutorials, which inspired me to do my own stuff and have been used
for this tutorial.

Using ISharedImages in plugin.xml

Eclipse provides a lot of general-purpose images, e.g. save, delete, edit, folders, etc. You can access these ImageDescriptors via:

PlatformUI.getWorkbench().getSharedImages().getImageDescriptor(ISharedImages.IMAGE_ID);

However, there is a more sophisticated way to create entries in toolbars and/or menus. This is shown in a great tutorial by Lars Vogel here. You now have a command and a referencing menu extension. To use the Eclipse shared images you just put the ISharedImages constant name in the icon field, like:

icon: IMG_ETOOL_SAVEALL_EDIT

It works fine for me in Eclipse 3.6 Helios.  The article is based on https://bugs.eclipse.org/bugs/show_bug.cgi?id=246224