Archive by tag: scala

Playframework and RequireJS

As a backend developer I like tools that help me structure my code. Doing more and more frontend work, I finally got time to learn some of the basics of RequireJS. Unfortunately, the pool of tutorials on how to combine the Play Framework and RequireJS in multi-page applications is not too big. There are some for AngularJS, but I didn’t want to port my applications to two new systems at once.

Application structure

For a sample application, I implemented two pages: an index page and a dashboard page.

Both have their own data entry point and dependencies. The index page looks like this:

@(message: String)
 
@main("RequireJS with Play") {
    // html here
 
 @helper.requireJs(core = routes.Assets.at("javascripts/require.js").url,
                   module = routes.Assets.at("javascripts/main/main").url)
 
}

The routes file is very basic, too:

GET    /                controllers.Application.index
GET    /dashboard       controllers.Application.dashboard
POST   /api/sample      controllers.Application.sample
 
### Additions needed
GET    /jsroutes.js     controllers.Application.jsRoutes()
### Enable www.WebJars.org based resources to be returned
GET    /webjars/*file   controllers.WebJarAssets.at(file)
GET    /assets/*file    controllers.Assets.at(path="/public", file)

The javascript folder layout

  • assets/javascripts
    • common.js
    • main.js
    • dashboard
      • chart.js
      • main.js
    • lib
      • math.js

How does it work?

First you define a file common.js, which is used to configure requirejs.

(function(requirejs) {
    "use strict";
 
    requirejs.config({
        baseUrl : "/assets/javascripts",
        shim : {
            "jquery" : {
                exports : "$"
            },
            "jsRoutes" : {
                exports : "jsRoutes"
            }
        },
        paths : {
            "math" : "lib/math",
            // Map the dependencies to CDNs or WebJars directly
            "_" : "//cdnjs.cloudflare.com/ajax/libs/underscore.js/1.5.1/underscore-min",
            "jquery" : "//localhost:9000/webjars/jquery/2.0.3/jquery.min",
            "bootstrap" : "//netdna.bootstrapcdn.com/bootstrap/3.0.0/js/bootstrap.min",
            "jsRoutes" : "//localhost:9000/jsroutes"
        // A WebJars URL would look like
        // //server:port/webjars/angularjs/1.0.7/angular.min
        }
    });
 
    requirejs.onError = function(err) {
        console.log(err);
    };
})(requirejs);

The baseUrl is important, as this will be the root path from now on. IMHO this makes things easier than relative paths.

The shim configuration is used to export your jsRoutes, which is defined in my Application.scala file. Of course you can add as many shims as you want.

The paths section is a bit tricky. Currently it seems there’s no better way than hardcoding the URLs, like “jsRoutes” : “//localhost:9000/jsroutes”, when you use WebJars.

Define and Require

Ordering is crucial! For my /dashboard page, dashboard/main.js is my entry point:

// first load the configuration
require(["../common"], function(common) {
   console.log('Dashboard started');
 
   // Then load submodules. Remember the baseUrl is set:
   // even though you are in the dashboard folder, you have to reference
   // dashboard/chart directly
   require(["jquery", "math", "dashboard/chart"], function($, math, chart){
       console.log("Title is : " + $('h1').text());
       console.log("1 + 3 = " + math.sum(1,3));
       console.log(chart);
 
       chart.load({ page : 'dashboard'}, function(data){
           console.log(data);
       }, function(status, xhr, error) {
           console.log(status);
       });
 
   });
});

For chart.js, use a factory function so the dependency exports are actually passed in (with an object literal as the second argument to define, you would only get the shim globals):

// first the configuration, then the other dependencies
define(["../common", "jsRoutes", "jquery"], function(common, jsRoutes, $) {
    return {
        load : function(data, onSuccess, onFail) {
            var r = jsRoutes.controllers.Application.sample();
            r.contentType = 'application/json';
            r.data = JSON.stringify(data);
            $.ajax(r).done(onSuccess).fail(onFail);
        }
    };
});

Future Composition with Scala and Akka

Scala is a functional and object-oriented language which runs on the JVM. For concurrent and/or parallel programming it is a suitable choice, along with the Akka framework, which provides a rich toolset for all kinds of concurrent tasks. In this post I want to show a little example of how to schedule a logfile-search job on multiple files/servers with Futures and Actors.

Setup

I created my setup with the Typesafe Activator Hello-Akka template. This results in a build.sbt file with the following content:

name := """hello-akka"""
 
version := "1.0"
 
scalaVersion := "2.10.2"
 
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.2.0",
  "com.typesafe.akka" %% "akka-testkit" % "2.2.0",
  "com.google.guava" % "guava" % "14.0.1",
  "org.scalatest" % "scalatest_2.10" % "1.9.1" % "test",
  "junit" % "junit" % "4.11" % "test",
  "com.novocode" % "junit-interface" % "0.7" % "test->default"
)
 
testOptions += Tests.Argument(TestFrameworks.JUnit, "-v")

Scala built-in Futures

Scala already has built-in support for Futures. The implementation is based on java.util.concurrent. Let’s implement a Future which runs our log search.

import scala.concurrent._
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits._
 
object LogSearch extends App {
 
  println("Starting log search")
 
  val searchFuture = future {
    Thread sleep 1000
    "Found something"
  }
 
  println("Blocking for results")
  val result = Await result (searchFuture, 5 seconds)
  println(s"Found $result")
}

This is all we need to run our task in another thread. The implicit import from ExecutionContext provides a default ExecutionContext which manages the threads the future runs on. After creating the future, we wait for the result with a blocking call to Await result. So far nothing too fancy.

Future composition

There are a lot of examples where the for-yield syntax is used to compose future results. In our case we have a dynamic list of futures: the log search results from each server.

For testing future capabilities we will create a list of futures from a list of ints which represent the time each task will run. The type annotations are just for clarification.

val tasks = List(3000, 1200, 1800, 600, 250, 1000, 1100, 8000, 550)
val taskFutures: List[Future[String]] = tasks map { ms =>
  future {
    Thread sleep ms
    s"Task with $ms ms"
  }
}

In the end, we want a List[String] as a result. This is done with the Future companion object.

val searchFuture: Future[List[String]] = Future sequence taskFutures

And finally we can wait for our results with

val result = Await result (searchFuture, 2 seconds)

However, this will throw a TimeoutException, as some of our tasks run for more than 2 seconds. Of course we could increase the timeout, but the error could always happen again when a server is down. Another approach would be to handle the exception and return an error, but then all other results would be lost.

Future – Timeout fallback

No problem, we generate a fallback which will return a default value if the operation takes too long. A very naive implementation of our fallback could look like this:

def fallback[A](default: A, timeout: Duration): Future[A] = future {
  Thread sleep timeout.toMillis
  default
}

The fallback future will return after the executing thread has slept for the timeout duration. The calling code now looks like this:

val timeout = 2 seconds
val tasks = List(3000, 1200, 1800, 600, 250, 1000, 1100, 8000, 550)
val taskFutures: List[Future[String]] = tasks map { ms =>
  val search = future {
    Thread sleep ms
    s"Task with $ms ms"
  }
 
  Future firstCompletedOf Seq(search,
    fallback(s"timeout $ms", timeout))
}
 
val searchFuture: Future[List[String]] = Future sequence taskFutures
 
println("Blocking for results")
val result = Await result (searchFuture, timeout * tasks.length)
println(s"Found $result")

The important call here is Future firstCompletedOf Seq(..), which produces a future returning the result of the first future to finish.
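The race behaviour of firstCompletedOf can be sketched in isolation. This is a minimal, runnable example with made-up names: the fast future wins the race and the slow result is simply discarded.

```scala
import scala.concurrent._
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Race a slow computation against a fast one; firstCompletedOf
// completes with whichever future finishes first.
def race[A](slowMs: Long, slowValue: A, fastValue: A): A = {
  val slow = Future { Thread.sleep(slowMs); slowValue }
  val fast = Future { fastValue }
  Await.result(Future.firstCompletedOf(Seq(slow, fast)), 5.seconds)
}

println(race(500, "slow", "fast"))
```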

This implementation is very bad, as discussed here. In short: we are wasting CPU time by putting threads to sleep. Also, the timeout for the blocking call is more or less a guess. With a one-thread scheduler it can actually take more time.

Futures and Akka

Now let’s do this in a more performant and more robust way. Our main goal is to get rid of the poor fallback implementation, which was blocking a complete thread. The idea is to schedule the fallback future after a given duration. This way all threads work on real tasks, while the fallback future’s execution time is almost zero. Java has a ScheduledExecutorService of its own, or you can use a different implementation, the HashedWheelTimer from Netty. Akka used to use the HashedWheelTimer, but now has its own implementation.
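Before switching to Akka, the scheduling idea itself can be sketched with plain Scala and a ScheduledExecutorService: a Promise is completed by a shared scheduler thread once the timeout fires, so no worker thread has to sleep. This is only a sketch with made-up names, not the Akka implementation.

```scala
import java.util.concurrent.{ Executors, ThreadFactory, TimeUnit }
import scala.concurrent._
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// One shared (daemon) scheduler thread arms every fallback, instead of
// parking a worker thread in Thread.sleep per fallback.
val scheduler = Executors.newSingleThreadScheduledExecutor(new ThreadFactory {
  def newThread(r: Runnable) = { val t = new Thread(r); t.setDaemon(true); t }
})

def fallback[A](default: A, timeout: FiniteDuration): Future[A] = {
  val p = Promise[A]()
  scheduler.schedule(new Runnable { def run(): Unit = p.success(default) },
    timeout.toMillis, TimeUnit.MILLISECONDS)
  p.future
}

def searchWithFallback(workMs: Long, timeout: FiniteDuration): String = {
  val search = Future { Thread.sleep(workMs); s"found in $workMs ms" }
  Await.result(
    Future.firstCompletedOf(Seq(search, fallback("timeout", timeout))),
    10.seconds)
}

println(searchWithFallback(5000, 200.millis)) // fallback wins
println(searchWithFallback(50, 2.seconds))    // real search wins
```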

So let’s start with the actor.

import akka.actor._
import akka.pattern.{ after, ask, pipe }
import akka.util.Timeout
import scala.concurrent._
import scala.concurrent.duration._
 
class LogSearchActor extends Actor {
 
  // execute the futures on the actor system's dispatcher
  import context.dispatcher
 
  def receive = {
    case Search(worktimes, timeout) =>
      // Doing all the work in one actor using futures
      val searchFutures = worktimes map { worktime =>
        val searchFuture = search(worktime)
        val fallback = after(timeout, context.system.scheduler) {
          Future successful s"$worktime ms > $timeout"
        }
        Future firstCompletedOf Seq(searchFuture, fallback)
      }
 
      // Pipe future results to sender
      (Future sequence searchFutures) pipeTo sender
  }
 
  def search(worktime: Int): Future[String] = future {
    Thread sleep worktime
    s"found something in $worktime ms"
  }
}
 
case class Search(worktimes: List[Int], timeout: FiniteDuration)

The important part is the after method call. You give it a duration after which the future should be executed and, as a second parameter, the scheduler, which in our case is the default one of the actor system. The third parameter is the future which should get executed. I use the Future successful companion method to return a single string.

The rest of the code is almost identical. pipeTo is an Akka pattern to return the result of a future to the sender. Nothing fancy here.

Now, how to call all this? First the code:

import akka.actor._
import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent._
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.util.{ Success, Failure }
import java.util.concurrent.TimeoutException
 
object LogSearch extends App {
 
  println("Starting actor system")
  val system = ActorSystem("futures")
 
  println("Starting log search")
  try {
    // timeout for each search task
    val fallbackTimeout = 2 seconds
 
    // timeout used with akka.pattern.ask
    implicit val timeout = new Timeout(5 seconds)
 
    require(fallbackTimeout < timeout.duration)
 
    // Create SearchActor
    val search = system.actorOf(Props[LogSearchActor])
 
    // Test worktimes for search
    val worktimes = List(1000, 1500, 1200, 800, 2000, 600, 3500, 8000, 250)
 
    // Asking for results
    val futureResults = (search ? Search(worktimes, fallbackTimeout))
      // Cast to correct type
      .mapTo[List[String]]
      // In case something went wrong
      .recover {
        case e: TimeoutException => List("timeout")
        case e: Exception => List(e.getMessage)
      }
 
    // Callback (non-blocking)
    futureResults onComplete {
      case Success(results) =>
        println(":: Results ::")
        results foreach (r => println(s"  $r"))
        system.shutdown()
      case Failure(t) =>
        t.printStackTrace()
        system.shutdown()
    }
  } catch {
    case t: Throwable =>
      t.printStackTrace()
      system.shutdown()
  }
 
  // Await end of program
  system.awaitTermination(20 seconds)
}

The comments should explain most of the parts. This example is completely asynchronous and works with callbacks. Of course you can use the Await result call as before.
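As a side-by-side sketch (plain Scala futures, hypothetical helper name, no Akka needed), the non-blocking callback style and the blocking Await style look like this:

```scala
import scala.concurrent._
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.util.{ Success, Failure }

// Gather all results into one future, as Future.sequence did above
def searchAll(worktimes: List[Int]): Future[List[String]] =
  Future.sequence(worktimes.map(ms => Future { s"task $ms" }))

// Non-blocking: a callback fires once all results are in
searchAll(List(100, 200)) onComplete {
  case Success(results) => results foreach println
  case Failure(t)       => t.printStackTrace()
}

// Blocking alternative, as in the earlier examples
println(Await.result(searchAll(List(100, 200)), 2.seconds))
```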

Links

https://gist.github.com/muuki88/6099946
http://doc.akka.io/docs/akka/2.1.0/scala/futures.html
http://stackoverflow.com/questions/17672786/scala-future-sequence-and-timeout-handling
http://stackoverflow.com/questions/16304471/scala-futures-built-in-timeout

Maven – Tycho, Java, Scala and APT

This tutorial shows a small project which is build with maven-tycho and the following requirements:

  • Mixed Java / Scala project
  • Eclipse plugin deployment
  • Eclipse Annotation Processing (APT)
  • Manifest-first approach
  • Java 7 / Scala 2.9.2
That doesn’t sound too hard. In fact it isn’t, if you are familiar with maven and how tycho works. 

Setting up maven

First download maven 3 and configure it. I created two profiles in my settings.xml and added some repositories. My two profiles are tycho-build and scala-build, which are activated when the corresponding property is present.
<settings>
  <profiles>
    <profile>
      <id>tycho</id>
      <activation>
        <activeByDefault>false</activeByDefault>
        <property>
          <name>tycho-build</name>
        </property>
      </activation>
      <repositories>
        <repository>
          <id>eclipse-indigo</id>
          <layout>p2</layout>
          <url>http://download.eclipse.org/releases/indigo</url>
        </repository>
        <repository>
          <id>eclipse-sapphire</id>
          <layout>p2</layout>
          <url>http://download.eclipse.org/sapphire/0.4.1/repository</url>
        </repository>
        <repository>
          <id>eclipse-scala-ide</id>
          <layout>p2</layout>
          <url>http://download.scala-ide.org/releases-29/milestone/site</url>
        </repository>
        <repository>
          <id>eclipse-gemini-dbaccess</id>
          <layout>p2</layout>
          <url>http://download.eclipse.org/gemini/dbaccess/updates/1.0</url>
        </repository>
      </repositories>
    </profile>
 
    <profile>
      <id>scala</id>
      <activation>
        <activeByDefault>false</activeByDefault>
        <property>
          <name>scala-build</name>
        </property>
      </activation>
      <repositories>
        <repository>
          <id>scala-tools.org</id>
          <name>Scala-tools Maven2 Repository</name>
          <url>http://scala-tools.org/repo-releases</url>
        </repository>
        <repository>
          <id>typesafe</id>
          <name>Typesafe Repository</name>
          <url>http://repo.typesafe.com/typesafe/releases/</url>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>scala-tools.org</id>
          <name>Scala-tools Maven2 Repository</name>
          <url>http://scala-tools.org/repo-releases</url>
        </pluginRepository>
      </pluginRepositories>
    </profile>
  </profiles>
</settings>

Setting up the project – The tycho build

For my project I just used two simple plugins. Nothing fancy here.
  1. Create plugin-project
  2. Add some dependencies
  3. Write some classes in Java

I recommend the following project structure:

root-project/
 plugin.core
 plugin.ui
 plugin.xy

Go to your root-project folder in your favorite console and use the following command to generate a pom.xml with tycho:

mvn org.sonatype.tycho:maven-tycho-plugin:generate-poms -DgroupId=de.mukis -Dtycho.targetPlatform=path/to/target/platform/

This generates a first project for you. A few things to tweak, which I saw as best practice in most of the other tutorials:
  • Replace all concrete version numbers with property placeholders, e.g 0.12.0 with ${tycho.version}
  • Remove all groupId and version tags in the pom.xml. The parent pom.xml will generate these.
  • Check your folder structure. Tycho infers AND changes your source directory according to your build.properties.
Next add the p2 repositories needed to resolve all dependencies. This is done via the <repository> tag. The full pom.xml is at the end.
Sometimes you have existing OSGi bundles, but no p2 repository you can use. Eclipse PDE has a nice extra feature for you: the Features and bundles publisher application. Note: it’s very important that your repository folder has the two folders plugins and features.
Now you can run your maven build with
mvn clean package
and you will get a nice packaged osgi bundle.

Setting up the project – The scala build

So now we want to add some Scala classes. Create a new source folder src/main/scala and create some classes. Don’t forget to import the Scala packages, so that your MANIFEST.MF contains something like:
Import-Package: org.osgi.framework;version="1.6.0",
 scala;version="[2.9.0.1,2.9.3.0]",
 scala.collection;version="[2.9.0.1,2.9.3.0]",
 scala.collection.generic;version="[2.9.0.1,2.9.3.0]",
 scala.collection.immutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.interfaces;version="[2.9.0.1,2.9.3.0]",
 scala.collection.mutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel.immutable;version="[2.9.0.1,2.9.3.0]",
 scala.collection.parallel.mutable;version="[2.9.0.1,2.9.3.0]",
 scala.concurrent;version="[2.9.0.1,2.9.3.0]",
 scala.concurrent.forkjoin;version="[2.9.0.1,2.9.3.0]",
 scala.io;version="[2.9.0.1,2.9.3.0]",
 scala.math;version="[2.9.0.1,2.9.3.0]",
 scala.parallel;version="[2.9.0.1,2.9.3.0]",
 scala.ref;version="[2.9.0.1,2.9.3.0]",
 scala.reflect,
 scala.reflect.generic;version="[2.9.0.1,2.9.3.0]",
 scala.runtime;version="[2.9.0.1,2.9.3.0]",
 scala.text;version="[2.9.0.1,2.9.3.0]",
 scala.util;version="[2.9.0.1,2.9.3.0]"
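To sanity-check the setup, any small class in src/main/scala will do; this one (a made-up example) only touches scala.collection.mutable, which the Import-Package header above already covers:

```scala
// Hypothetical sample class placed in src/main/scala; it only uses
// scala.collection.mutable, covered by the Import-Package entries above.
class EventLog {
  private val entries = scala.collection.mutable.ListBuffer[String]()

  def add(entry: String): this.type = { entries += entry; this }

  def all: List[String] = entries.toList
}

val log = new EventLog
log.add("bundle started").add("model loaded")
println(log.all)
```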
Now there are two alternatives to build. I chose to add the source folder in my build.properties and exclude the .scala files in my maven pom. The alternative is described here.
We need the maven-scala-plugin. Add the repository:
...
 <repository>
  <id>scala-tools.org</id>
  <name>Scala-tools Maven2 Repository</name>
  <url>http://scala-tools.org/repo-releases</url>
 </repository>
...
 <pluginRepository>
  <id>scala-tools.org</id>
  <name>Scala-tools Maven2 Repository</name>
  <url>http://scala-tools.org/repo-releases</url>
 </pluginRepository>
and to our root pom.xml we add the maven-scala-plugin
<plugin>
 <groupId>org.scala-tools</groupId>
 <artifactId>maven-scala-plugin</artifactId>
 <version>2.15.0</version>
 <executions>
  <execution>
   <id>compile</id>
   <goals>
    <goal>compile</goal>
   </goals>
   <phase>compile</phase>
  </execution>
 
  <execution>
   <id>test-compile</id>
   <goals>
    <goal>testCompile</goal>
   </goals>
   <phase>test-compile</phase>
  </execution>
 
  <execution>
   <phase>process-resources</phase>
   <goals>
    <goal>compile</goal>
   </goals>
  </execution>
 </executions>
</plugin>
There is actually an easier version, but it doesn’t work with circular dependencies.
If you have added the src/main/scala folder in your build.properties, then you have to add another plugin to prevent tycho from exporting all scala source files.
<plugin>
 <groupId>org.eclipse.tycho</groupId>
 <artifactId>tycho-compiler-plugin</artifactId>
 <version>${tycho.version}</version>
 <configuration>
  <excludeResources>
   <excludeResource>**/*.scala</excludeResource>
  </excludeResources>
 </configuration>
</plugin>
Now the build should work with scala, too.

Setting up the project – APT code generation with Eclipse Sapphire

I’m creating some models with Eclipse Sapphire, which uses Java Annotation Processing (APT) to generate the models. The apt-maven-plugin is a maven plugin that allows us to trigger a processing factory during the build process. The current version alpha-04 has a bug which leads to an error with Java 7. So, before we can use this plugin, you have to check out the source code and build the latest alpha-05 version, as it’s not released at the moment. Install it in your local maven repository.
Now you can add the apt-maven-plugin to the plugin which needs apt. This could look like:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
 
  <parent>
    <groupId>de.lmu.ifi.dbs.knowing</groupId>
    <artifactId>Knowing</artifactId>
    <version>0.1.4-SNAPSHOT</version>
  </parent>
 
  <artifactId>de.lmu.ifi.dbs.knowing.core</artifactId>
  <packaging>eclipse-plugin</packaging>
 
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>apt-maven-plugin</artifactId>
        <version>1.0-alpha-5-SNAPSHOT</version>
        <executions>
          <execution>
            <goals>
              <goal>process</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <factory>org.eclipse.sapphire.sdk.build.processor.internal.APFactory</factory>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
At last you have to add the factory as optional dependencies to the MANIFEST.MF of your plugin using apt:
org.eclipse.sapphire.sdk;bundle-version="[0.4.0,0.5.0)";resolution:=optional,
org.eclipse.sapphire.sdk.build.processor;bundle-version="[0.4.0,0.5.0)";resolution:=optional
If you trigger the build, you will see that your apt sources are generated in target/generated-sources/apt. However, the files are not compiled. At first I tried the maven-build-helper, but tycho seems to override these settings. So I added target/generated-sources/apt to the build.properties of the plugin using apt, which seems to me like a bad workaround. However, it works fine.

Source Code

You can find the code in my github repository.

Conclusion

For a beginner it was not that easy to avoid all the little traps with tycho, scala, maven and apt. But in the end I hope to save a lot of time when building and testing.

Things to add

The tutorial doesn’t include any testing.

Links

https://github.com/muuki88/tycho
http://wiki.eclipse.org/Tycho/Reference_Card
http://mattiasholmqvist.se/2010/02/building-with-tycho-part-1-osgi-bundles/
https://github.com/misto/Scala-Hello-World-Plug-in
Compiling circular dependent java-scala classes
Eclipse sapphire and tycho
compile generated sources
http://mojo.codehaus.org/apt-maven-plugin/
APT M2E Connector
Publish pre-compiled bundles in p2 repository