Tuesday, July 8, 2014

JSF Primefaces File upload (and commons library), what every developer should know

If you google for examples of uploading files in JSF with Primefaces, you will find a lot of pages showing how to do it. Unfortunately, almost all of them lack some crucial details.
In this post, we will cover some important points about uploading files with Primefaces. I will assume you already know how to implement a simple file upload.

PS: please refer to the many available resources on how to implement a simple file upload. You can use this link.

1. Choosing the right "thresholdSize" value

During the upload operation, uploaded files are kept either in server memory or in temporary files on disk, until you finish processing them.
The decision is made based on the file size compared to a property called sizeThreshold ("The threshold above which uploads will be stored on disk", see DiskFileItemFactory for details).
You can specify the value of sizeThreshold in web.xml as an init-param of the Primefaces file upload filter (as in the above mentioned article). However, you must choose the right value based on your application usage. If you define a large value (10 MB for example), and 100 users can upload files to your application at the same time, then you will need up to 1 GB of memory just to handle file uploads. If that memory is not available, you will encounter OutOfMemory errors, and you will have to increase the max heap size every time the number of users grows.
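For reference, here is a sketch of how the filter and its thresholdSize init-param are declared in web.xml (the filter name and servlet mapping follow the common PrimeFaces setup; adjust them to your own configuration):

```xml
<filter>
    <filter-name>PrimeFaces FileUpload Filter</filter-name>
    <filter-class>org.primefaces.webapp.filter.FileUploadFilter</filter-class>
    <init-param>
        <!-- Files larger than ~1 MB go to temp files on disk instead of memory -->
        <param-name>thresholdSize</param-name>
        <param-value>1048576</param-value>
    </init-param>
</filter>
<filter-mapping>
    <filter-name>PrimeFaces FileUpload Filter</filter-name>
    <servlet-name>Faces Servlet</servlet-name>
</filter-mapping>
```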

2. Make sure your file names are correctly processed


If you store your files after the upload as in the mentioned example, and then upload a file whose name contains some French characters (à, é, etc.) or even an Arabic name, you will get some weird results like in the following picture:



The first file was named ààààà.txt; the second had an Arabic name.

The solution is to decode the file name using the UTF-8 charset; here is how to do it:


public void handleFileUpload(FileUploadEvent event) {
    // Needed imports:
    // import java.io.UnsupportedEncodingException;
    // import java.nio.charset.Charset;
    // import org.apache.commons.io.FilenameUtils;

    // In most cases FileUploadEvent#getFile()#getFileName() returns the base
    // file name, without path information. However, some clients, such as the
    // Opera browser, do include path information (from the FileItem class docs).
    String fileName = FilenameUtils.getName(event.getFile().getFileName());
    try {
        fileName = new String(fileName.getBytes(Charset.defaultCharset()), "UTF-8");
    } catch (UnsupportedEncodingException e1) {
        LOGGER.error("Error in charset:", e1);
    }
}

Now you can try to upload files with special characters and check that everything works fine.

3. Make sure the temporary files are deleted


As mentioned in the FileUpload docs, when using DiskFileItem to upload files (which means uploaded files are written to temporary files on disk), resource cleanup should be applied. To do so (refer to the "Resource cleanup" paragraph), "an instance of org.apache.commons.io.FileCleaningTracker must be used when creating a org.apache.commons.fileupload.disk.DiskFileItemFactory".
Unfortunately, this is not implemented in the Primefaces FileUploadFilter (see this issue). So until a future release comes with a fix, you need to define a new filter extending FileUploadFilter and change its doFilter to the following:

public void doFilter(ServletRequest request, ServletResponse response, FilterChain filterChain) throws IOException, ServletException {
    if (bypass) {
        filterChain.doFilter(request, response);
        return;
    }

    HttpServletRequest httpServletRequest = (HttpServletRequest) request;
    boolean isMultipart = ServletFileUpload.isMultipartContent(httpServletRequest);

    if (isMultipart) {
        logger.debug("Parsing file upload request");
        // This line is added
        FileCleaningTracker fileCleaningTracker = FileCleanerCleanup.getFileCleaningTracker(request.getServletContext());
        DiskFileItemFactory diskFileItemFactory = new DiskFileItemFactory();
        // This line is added
        diskFileItemFactory.setFileCleaningTracker(fileCleaningTracker);
        if (thresholdSize != null) {
            diskFileItemFactory.setSizeThreshold(Integer.valueOf(thresholdSize));
        }
        if (uploadDir != null) {
            diskFileItemFactory.setRepository(new File(uploadDir));
        }

        ServletFileUpload servletFileUpload = new ServletFileUpload(diskFileItemFactory);
        MultipartRequest multipartRequest = new MultipartRequest(httpServletRequest, servletFileUpload);

        logger.debug("File upload request parsed successfully, continuing with filter chain with a wrapped multipart request");

        filterChain.doFilter(multipartRequest, response);
    } else {
        filterChain.doFilter(request, response);
    }
}
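Then register your extended filter in web.xml in place of the standard one (a sketch; com.example.CustomFileUploadFilter stands for whatever name you give your subclass):

```xml
<filter>
    <filter-name>PrimeFaces FileUpload Filter</filter-name>
    <!-- Your subclass of org.primefaces.webapp.filter.FileUploadFilter -->
    <filter-class>com.example.CustomFileUploadFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>PrimeFaces FileUpload Filter</filter-name>
    <servlet-name>Faces Servlet</servlet-name>
</filter-mapping>
```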
And of course, don't forget to add the FileCleanerCleanup listener to your web.xml so that it stops the created threads:


<listener>
   <listener-class>
   org.apache.commons.fileupload.servlet.FileCleanerCleanup
   </listener-class>
</listener>
Now, according to the FileUpload docs, temp files will be deleted automatically when the corresponding java.io.File instance is no longer used and gets garbage collected.
Unfortunately, that didn't work for me under Tomcat 7. Maybe you will face the same issue, or you may simply want to ensure that this deletion happens earlier, thus preserving system resources.
The best solution I found to manually delete these temp files was to add a new method to org.primefaces.model.UploadedFile that returns the corresponding instance of org.apache.commons.fileupload.FileItem, and then delete it by calling its delete() method.
You don't need to get the Primefaces source code and build it manually: just recreate the same packages (e.g. org.primefaces.model) in your project and add the modified classes there. The classloader will load your classes instead of those shipped with Primefaces.

4. Make sure you are copying files efficiently


In many examples, I saw people saving files by calling getContents() on the UploadedFile instance. This is really bad practice, since it loads the whole content of the file into server memory. Imagine writing 1000 files of 10 MB each, simultaneously!
So unless you are sure your files are really small, never call that method. Instead, use getInputstream() like in the above mentioned article. There is an even faster way to copy the content of the file, using the Java NIO API (the source of this solution is here):


private void fastFileCopy(UploadedFile file, String filePath) {
    try {
        final InputStream input = file.getInputstream();
        final OutputStream output = new FileOutputStream(filePath);
        CustomFileUtils.copyStream(input, output);

        // And call the delete method here:
        file.getFileItem().delete();
    } catch (IOException e) {
        LOGGER.error("Error upload,", e);
    }
}
 
 public static void copyStream(final InputStream input, final OutputStream output)
   throws IOException {
  final ReadableByteChannel inputChannel = Channels.newChannel(input);
  final WritableByteChannel outputChannel = Channels.newChannel(output);
  // copy the channels
  final ByteBuffer buffer = ByteBuffer.allocateDirect(16 * 1024);
  while (inputChannel.read(buffer) != -1) {
   // prepare the buffer to be drained
   buffer.flip();
   // write to the channel, may block
   outputChannel.write(buffer);
   // If partial transfer, shift remainder down
   // If buffer is empty, same as doing clear()
   buffer.compact();
  }
  // EOF will leave buffer in fill state
  buffer.flip();
  // make sure the buffer is fully drained.
  while (buffer.hasRemaining()) {
   outputChannel.write(buffer);
  }
  // closing the channels
  inputChannel.close();
  outputChannel.close();
 }

And that's the best solution regarding speed and performance that I can think of for now.
The source code of this tutorial is available on Github.

I will be happy to receive any feedback.

Monday, June 30, 2014

Spring sample application, environment dependent properties files, Logback, JUnit and Ehcache

In this article we will create a basic Maven Spring application. The application will have the following features:

  1. Depending on an environment variable (call it env), the application will load environment-specific parameters, e.g. DB URL, schemas, and so on.
  2. The application must write some custom log messages to a particular log file. Let's say that this file will contain specific log messages to be viewed by an admin.
  3. The application will use Ehcache to cache calls to specific methods.
PS: The source code of this article is on Github.

1. Create the project with maven

First, let's create a Maven web project. To do so, go to the location where you want to create the project and run the following command (you must have Maven installed; see the link for instructions):

mvn archetype:generate -DarchetypeArtifactId=maven-archetype-webapp

It will ask you to enter the groupId, the artifactId, the version and the package for the newly generated project.
After that, go to Eclipse (or your favorite IDE) and import the project. In Eclipse, right click in the project explorer and choose Import -> Import... A dialog will open:

Chose "Existing Maven Projects" and click Next>.
Now browse to where you executed the mvn command and select the newly created (by maven) project having for name the value of artifact that you gave:



And click Finish.
You will see the newly imported project in your project explorer in Eclipse:


Now right click on the src/main folder, choose New->Folder and name it "java".
After that, right click on the src folder, choose New->Folder and name it "test". Then add another folder named "java" under the newly created "test" folder. Now your project should look like this:


Now open the pom.xml file and change it so it looks like the following:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.raissi</groupId>
  <artifactId>test-maven-project</artifactId>
  <packaging>war</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>test-maven-project Maven Webapp</name>
  <url>http://maven.apache.org</url>
  <dependencies>
   <!-- Just for tests -->
 <dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>javax.servlet-api</artifactId>
  <version>3.0.1</version>
  <scope>test</scope>
 </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <properties>
  <spring.version>4.0.4.RELEASE</spring.version>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>
  <build>
  <finalName>spring-cache-tutorial</finalName>
  <plugins>
   <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.1</version>
    <configuration>
     <source>1.7</source>
     <target>1.7</target>
     <encoding>${project.build.sourceEncoding}</encoding>
    </configuration>
   </plugin>
   <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <version>2.4</version>
    <configuration>
     <archive>
      <manifestEntries>
       <DisableIBMJAXWSEngine>true</DisableIBMJAXWSEngine>
      </manifestEntries>
     </archive>
    </configuration>
   </plugin>

  </plugins>
 </build>
</project>

You now need to run (right click on the project) Maven->Update Project...
Now the project should be ready to add some code to it.

2. Prepare the Spring application

As said in the introduction, this is a Spring application, so first we need to add the Spring dependencies to our project. In the pom.xml file, add the following:

<dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-core</artifactId>
   <version>${spring.version}</version>
   <exclusions>
    <exclusion>
     <groupId>commons-logging</groupId>
     <artifactId>commons-logging</artifactId>
    </exclusion>
   </exclusions>
  </dependency>
  <dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-beans</artifactId>
   <version>${spring.version}</version>
  </dependency>
  <dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-aop</artifactId>
   <version>${spring.version}</version>
  </dependency>
  <dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-context</artifactId>
   <version>${spring.version}</version>
   <exclusions>
    <exclusion>
     <groupId>commons-logging</groupId>
     <artifactId>commons-logging</artifactId>
    </exclusion>
   </exclusions>
  </dependency>
  <!-- Various Application Context utilities, including EhCache, JavaMail, 
   Quartz, and Freemarker integration Define this if you need any of these integrations -->
  <dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-context-support</artifactId>
   <version>${spring.version}</version>
  </dependency>
  <dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-tx</artifactId>
   <version>${spring.version}</version>
  </dependency>
  <dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-jdbc</artifactId>
   <version>${spring.version}</version>
  </dependency>
  <dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-web</artifactId>
   <version>${spring.version}</version>
  </dependency>

  <dependency>
   <groupId>org.aspectj</groupId>
   <artifactId>aspectjweaver</artifactId>
   <version>1.7.4</version>
  </dependency>
  <dependency>
   <groupId>org.aspectj</groupId>
   <artifactId>aspectjrt</artifactId>
   <version>1.7.4</version>
  </dependency>
  <!-- JSR 330 -->
  <dependency>
   <groupId>javax.inject</groupId>
   <artifactId>javax.inject</artifactId>
   <version>1</version>
  </dependency>

Now let's create the applicationContext.xml file to configure Spring. Under src/main/webapp/WEB-INF, add a file named applicationContext.xml:

Next, open the file and add the following:

<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:aop="http://www.springframework.org/schema/aop"
 xmlns:context="http://www.springframework.org/schema/context" xmlns:p="http://www.springframework.org/schema/p"
 xmlns:tx="http://www.springframework.org/schema/tx" xmlns:task="http://www.springframework.org/schema/task"
 xmlns:cache="http://www.springframework.org/schema/cache"
 xsi:schemaLocation="http://www.springframework.org/schema/beans
 http://www.springframework.org/schema/beans/spring-beans-4.0.xsd
 http://www.springframework.org/schema/context
 http://www.springframework.org/schema/context/spring-context-4.0.xsd
 http://www.springframework.org/schema/tx
    http://www.springframework.org/schema/tx/spring-tx-4.0.xsd
    http://www.springframework.org/schema/aop
    http://www.springframework.org/schema/aop/spring-aop-4.0.xsd
    http://www.springframework.org/schema/task
    http://www.springframework.org/schema/task/spring-task-4.0.xsd
    http://www.springframework.org/schema/cache 
    http://www.springframework.org/schema/cache/spring-cache-4.0.xsd">

 <!-- Base package for Spring to look for annotated beans -->
 <context:component-scan base-package="com.raissi" />
 <!-- Activates various annotations to be detected in bean classes: Spring's @Required and @Autowired, as well as JSR 
    250's @PostConstruct, @PreDestroy and @Resource (if available) etc...-->
 <context:annotation-config></context:annotation-config>

</beans>

For now, the file contains only the base package that tells Spring where to find our beans, plus context:annotation-config to tell Spring that we are using annotations to define our resources.

3. Properties file

By now, we have Spring correctly configured, so let's move on to the first requirement of our application.
The goal here is to have multiple properties files, with the application using the right one for each environment.
Let's say you have servers on different machines: one dedicated to DEV teams, another for TEST teams, and another one for PRODUCTION. And you are using Jenkins or another Continuous Integration tool to build and deploy your application.
Evidently, the 3 environments have different values for config parameters like DB URL, username, password, etc.
To avoid having to change these values manually in your properties files (or using Maven to change them), we will create three properties files, each dedicated to a specific environment:
spring-sample.dev.properties, spring-sample.test.properties and spring-sample.prod.properties. Notice that the only difference in their names is the word "dev", "test" or "prod" after "spring-sample".
The idea is to tell Spring to load the appropriate file based on an environment variable that we will call "env".
On a PRODUCTION server, the value of "env" must be "prod" so that our application loads the spring-sample.prod.properties file. Same thing for the DEV and TEST environments.
So you must create a new environment variable on your system named "env", with "dev", "test" or "prod" as its value. You can change the value later to check that the right file is being loaded.
After creating the variable, you must shut down Eclipse and start it again so that it picks up the newly created variable. Notice that if you click "Restart" in Eclipse, the JVM does not exit, so the new variable is not discovered.
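To make the mechanism concrete, here is a minimal plain-Java sketch (EnvResolver is a hypothetical helper for illustration, not part of the project) of how the ${env} placeholder turns into a concrete file name:

```java
public class EnvResolver {

    // Builds the properties file name the same way the placeholder in
    // "spring-sample.${env}.properties" is resolved (illustrative helper)
    static String resolvePropertiesFile(String env) {
        if (env == null) {
            throw new IllegalStateException("The 'env' environment variable is not set");
        }
        return "spring-sample." + env + ".properties";
    }

    public static void main(String[] args) {
        // On a real server the value comes from the system environment:
        String env = System.getenv("env") != null ? System.getenv("env") : "dev";
        System.out.println(resolvePropertiesFile(env));
    }
}
```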
After that, let's edit the applicationContext.xml file to add following entry:

<!-- application.properties will contain all our config data: db username, 
  password, etc... -->
 <context:property-placeholder
  location="classpath:spring-sample.${env}.properties"/>
As the documentation says, the "property-placeholder" entry "Activates replacement of ${...} placeholders by registering a PropertySourcesPlaceholderConfigurer within the application context". For example, if you want to define a dataSource to access the DB, with its parameters (like jdbcUrl) defined in the properties file, you just write the following:

<bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource"  destroy-method="close">
 <property name="driverClass" value="${db.className}" />
 <property name="jdbcUrl" value="${db.url}" />
 <property name="user" value="${db.username}" />
 <property name="password" value="${db.password}" />     
</bean>
Then, in your spring-sample.dev.properties file (supposing the value of your "env" variable is "dev"), define values for db.className, db.url, etc.
e.g. db.username=raissi.

And that's it! Very simple, but very useful. From now on, depending on your environment type, Spring will load the appropriate properties file.

4. Add Logback

Logback is a logging library for Java, written by the same author as log4j. It "is intended as a successor to the popular log4j project" and offers a faster implementation. More reasons why you should use Logback can be found here.
To use Logback, you first need to add its dependencies to your pom.xml:

<!-- SLF4J -->
  <dependency>
   <groupId>org.slf4j</groupId>
   <artifactId>slf4j-api</artifactId>
   <version>1.7.5</version>
  </dependency>

  <dependency>
   <groupId>org.slf4j</groupId>
   <artifactId>jcl-over-slf4j</artifactId>
   <version>1.7.5</version>
  </dependency>

  <!-- Log Back -->
  <dependency>
   <groupId>ch.qos.logback</groupId>
   <artifactId>logback-classic</artifactId>
   <version>1.1.1</version>
  </dependency>
  <dependency>
   <groupId>ch.qos.logback</groupId>
   <artifactId>logback-core</artifactId>
   <version>1.1.1</version>
  </dependency>

You may also have noticed that in the Spring dependencies I excluded commons-logging. In fact, Spring uses Commons Logging by default for its logs, and to make it use Logback, you must exclude commons-logging as I did in the Spring dependencies.
Now, let's configure Logback. It's very simple: all you need is to add (as with log4j) an XML file called logback.xml to your application classpath:


P.S.: If you were using Log4j and just want to convert your log4j.properties file automatically, there is an online tool for this.
Before configuring our logging, let's explain what we want to do:

  1. The console must display all logs (depending on our global log level)
  2. There must be a log file for every level: one for debug, one for error and one for info messages
  3. Some particular messages (of any level) must be written to a special file. Let's say they are special messages intended for the admins of the application.
  4. When reaching a particular size, the log file is compressed into a zip archive.
From the docs, "Logback delegates the task of writing a logging event to components called appenders". So you need to define an appender for each specific need. For example, to write to the console:

<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
  <encoder>
   <pattern>%d{[yyyy-MM-dd] HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
  </encoder>
 </appender>
This does not do much. Let's now define an appender that writes info messages to a specific file:


<!-- We use a RollingFileAppender to back up the log files into the previously mentioned ZIP archives -->
 <appender name="FILE-INFO"
  class="ch.qos.logback.core.rolling.RollingFileAppender">
  <!-- The file location -->
  <file>${log.basefolder}/${log.info.filename}</file>
 
  <!-- The rolling policy defines how to roll over files.
    Here I am choosing to keep up to 3 zip archives, deleting the oldest one when needed
  -->
  <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
   <fileNamePattern>${log.rolling.folder}/sample-info.%i.log.zip</fileNamePattern>
   <minIndex>1</minIndex>
   <maxIndex>3</maxIndex>
  </rollingPolicy>
 
  <!-- The trigger to roll over a file; here I'm using a size based trigger: when the file reaches maxFileSize, a rollover takes place -->
  <triggeringPolicy
   class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
   <maxFileSize>${log.file.maxSize}</maxFileSize>
  </triggeringPolicy>
  
  <!-- The pattern for our messages, just like Log4j -->
  <encoder>
   <pattern>%d{[yyyy-MM-dd] HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
  </encoder>
  <!-- The level filter, accept only INFO messages in this appender -->
  <filter class="ch.qos.logback.classic.filter.LevelFilter">
   <level>INFO</level>
   <onMatch>ACCEPT</onMatch>
   <onMismatch>DENY</onMismatch>
  </filter>
 </appender>
You can see in the comments inside the code that we are using a filter to accept only INFO messages and write them to a specific file. We are also using a rollover policy to back up files into up to 3 zip archives; when the max is exceeded, the oldest one is deleted and logging continues.
One other thing to notice is the use of placeholders like ${log.info.filename}.
To be able to use such placeholders, you need to add a property element specifying where to load the placeholder values from:

<property resource="spring-sample.${env}.properties" />
And here we are referring to our environment dependent resource file, previously used with Spring. The nice thing here is that Logback recognizes environment variables, just like Spring does.
Here is an example of my spring-sample.dev.properties file:

log.basefolder=path-to-a-folder-that-will-contain-our-log-files
log.info.filename=sample-info.log
log.debug.filename=sample-debug.log
log.error.filename=sample-error.log
log.audit.filename=sample-audit.log

#Max size for a log file
log.file.maxSize=5MB
#folder to save in log files when the log size exceeds maxSize
log.rolling.folder=path-to-a-folder-that-will-contain-our-log-files-archives

The same thing should be done for the debug and error levels.
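For example, the error appender can be defined by analogy with FILE-INFO (a sketch, assuming the same properties keys defined above):

```xml
<appender name="FILE-ERROR"
  class="ch.qos.logback.core.rolling.RollingFileAppender">
  <file>${log.basefolder}/${log.error.filename}</file>
  <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
    <fileNamePattern>${log.rolling.folder}/sample-error.%i.log.zip</fileNamePattern>
    <minIndex>1</minIndex>
    <maxIndex>3</maxIndex>
  </rollingPolicy>
  <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
    <maxFileSize>${log.file.maxSize}</maxFileSize>
  </triggeringPolicy>
  <encoder>
    <pattern>%d{[yyyy-MM-dd] HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
  </encoder>
  <!-- Accept only ERROR messages in this appender -->
  <filter class="ch.qos.logback.classic.filter.LevelFilter">
    <level>ERROR</level>
    <onMatch>ACCEPT</onMatch>
    <onMismatch>DENY</onMismatch>
  </filter>
</appender>
```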
Another thing we need with our logs is a custom file for admin messages. To do so, we need another filter for our appender than the LevelFilter previously used for INFO messages.
This time, we will use an EvaluatorFilter that uses Markers to decide the type of messages:

<appender name="AUDIT_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
  <!-- the filter element -->
  <filter class="ch.qos.logback.core.filter.EvaluatorFilter">
   <evaluator class="ch.qos.logback.classic.boolex.OnMarkerEvaluator">
    <!-- you can use any other value, just make sure, you use the same value in your Java code -->
    <marker>AUDIT_SYS</marker>
   </evaluator>
   <onMismatch>DENY</onMismatch>
   <onMatch>ACCEPT</onMatch>
  </filter>
  
  <file>${log.basefolder}/${log.audit.filename}</file>
  <encoder>
   <pattern>%d{[yyyy-MM-dd] HH:mm:ss.SSS} %level %logger{36} - %msg %n</pattern>
  </encoder>

  <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
   <fileNamePattern>${log.rolling.folder}/sample-audit.%i.log.zip</fileNamePattern>
   <minIndex>1</minIndex>
   <maxIndex>3</maxIndex>
  </rollingPolicy>

  <triggeringPolicy
   class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
   <maxFileSize>${log.file.maxSize}</maxFileSize>
  </triggeringPolicy>
 </appender>
Here, the only thing that changed (compared to the previous appender for INFO messages) is the filter part.

The last thing to do, is tell Logback to use our defined appenders, and also to specify the global log level for our application:

 <!-- Levels by packages and classes: --> 
 <!-- you can define as many as you want loggers. just like in Log4j, 
 and this may be also by class or by package -->
    <logger name="com.sample.services" level="debug"/>
    <logger name="org.springframework.jdbc.core" level="TRACE">
    <appender-ref ref="STDOUT" />
    </logger>
 <logger name="com.sample.Foo" level="info"/>
 
 <root level="debug">
  <appender-ref ref="STDOUT" />
  <appender-ref ref="FILE-INFO" />
  <appender-ref ref="FILE-DEBUG" />
  <appender-ref ref="FILE-ERROR" />
  <appender-ref ref="AUDIT_FILE" />
 </root>
Finally, let's log some messages from our Java code:
package com.raissi;

import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.Marker;
import org.slf4j.MarkerFactory;

public class LogbackDemoClass {

    private static final Logger LOGGER = LoggerFactory.getLogger(LogbackDemoClass.class);
    private static final Marker CACHE_LOG = MarkerFactory.getMarker("AUDIT_SYS");
    
    @Test
    public void testLogs(){
        LOGGER.debug("This message will go to debug file only, and will contain a param: {}", "paramValue");
        LOGGER.debug(CACHE_LOG, "This message will go to debug file and admin log file also, and will contain a param: {}", "paramToAdmin");
    }
}
A few things to note here:

  • We use the same value to build the Marker object as the one defined in the logback.xml file
  • There is no need to test whether debug is enabled, as we would in some other logging libraries
  • We use placeholders to introduce parameters, so that the complete String message is only built if the message is actually going to be printed
By now, you should have a very good and convenient Logback configuration. So enjoy logging!

5. Caching with Ehcache

The last part of this article is configuring Ehcache for use with our Spring application. Caching is mainly used for expensive calls whose results change rarely, or at a known rate.
If you google "Ehcache with Spring example", you will find many tutorials on this subject. So why am I writing about it again? To address a point rarely considered in these tutorials.
So let's begin by configuring Ehcache for our application.
The first thing to do is to add the Ehcache dependency to your pom.xml file:

  <!-- Ehcache -->
  <dependency>
   <groupId>net.sf.ehcache</groupId>
   <artifactId>ehcache</artifactId>
   <version>2.7.4</version>
  </dependency>

Next, you need to configure Spring to use Ehcache for caching. In fact, Spring can be configured to use multiple cache implementations; see the docs for more details.
So, in the applicationContext.xml file, add the following:

<!-- Tell Spring that we going to use cache annotations in our Java code -->
 <cache:annotation-driven/>
 <bean id="cacheManager" class="org.springframework.cache.ehcache.EhCacheCacheManager"
  p:cache-manager-ref="ehcache" />
 <!-- EhCache library setup -->
 <bean id="ehcache"
  class="org.springframework.cache.ehcache.EhCacheManagerFactoryBean"
  p:config-location="classpath:ehcache.xml" />
Here we are referring to a file named ehcache.xml; that is where Ehcache is configured and our caches are defined. See this page for details about what you should define there.
Here is my ehcache.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<ehcache xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xsi:noNamespaceSchemaLocation="http://ehcache.org/ehcache.xsd">
 <defaultCache eternal="false" maxElementsInMemory="100"
  overflowToDisk="false" />
 <cache name="spring-cache" maxElementsInMemory="1000000" eternal="false"
  overflowToDisk="false" />
</ehcache>
The main thing to notice here is the cache named "spring-cache"; we will refer to it in our cache annotations later. There is also the maxElementsInMemory property that defines the maximum number of elements this cache can contain. You can define more than one cache.
Now, in your Spring beans, you just add annotations to cache method calls, or to evict elements from the cache.
Here is a simple example of a service class:

package com.raissi;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.Marker;
import org.slf4j.MarkerFactory;
import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class PersonServiceImpl implements PersonService{
 
 private static final Logger LOGGER = LoggerFactory.getLogger(PersonServiceImpl.class);
 private static final Marker CACHE_LOG = MarkerFactory.getMarker("AUDIT_SYS");
 
 private List<String> units = new ArrayList<>(Arrays.asList("UNIT01", "UNIT02", "UNIT03"));
 private List<String> persons = new ArrayList<>(Arrays.asList("PERSON01", "PERSON02", "PERSON03"));
 
 
 @Override
 @Cacheable(value = "spring-cache")
 public List<String> getPersonNames(String department){
  LOGGER.info(CACHE_LOG, "Getting Person names of department {}", department);
  return new ArrayList<>(persons);
 }

 @Override
 @Cacheable(value = "spring-cache")
 public List<String> getDepartmentUnitNames(String department) {
  LOGGER.info(CACHE_LOG, "Getting unit names of department {}", department);
  return units;
 }
 
 @Override
 @CacheEvict(value = { "spring-cache" }, key="#root.targetClass.getName() + 'getPersonNames' + #department")
 public void savePersonInDepartment(String person, String department){
  persons.add(person);
 }
 
 @Override
 public void savePersonInDepartmentNoEvict(String person, String department){
  persons.add(person);
 }
}

Very simple: whenever we want to cache a method call, we annotate it with Cacheable, giving the cache name as value.
The only point that may be a little tricky here is the CacheEvict annotation, which is used to evict an entry from the cache. Let me explain.
In the case of the getPersonNames(String department) method, we are caching its results: the first time you call getPersonNames("DEPT01"), the call invokes the method implementation, returning the list of persons in department "DEPT01".
The next time the method is called, the result is fetched from the cache, which means the method implementation won't be invoked.
Now, what if we want to add a new person to this department? In that situation, we have to evict the entry associated with DEPT01 from the cache. This is done by annotating the method that modifies the content of DEPT01 with CacheEvict, giving the annotation the key of the entry to be deleted. Which brings us to the main point of this part of the article: the cache keys.

Default cache key generator in Spring

When caching a method call (which is similar to putting an object in a map), Spring generates a key for it so the cached result can be retrieved the next time the method is called. For this, Spring offers a default key generation mechanism: DefaultKeyGenerator in versions prior to Spring 4, and SimpleKeyGenerator in Spring 4.
If you look at the code of key generation :

@Override
 public Object generate(Object target, Method method, Object... params) {
  if (params.length == 0) {
   return SimpleKey.EMPTY;
  }
  if (params.length == 1) {
   Object param = params[0];
   if (param != null && !param.getClass().isArray()) {
    return param;
   }
  }
  return new SimpleKey(params);
 }
And here, you can notice that only the parameters of the method are considered when generating the key: neither the target object nor the method itself is part of it.
You can refer to this Jira issue to see a discussion about it.
So, by default, both our cached methods getPersonNames and getDepartmentUnitNames produce the same key when called with the same "DEPT01" parameter, so the second one called would get the cached result of the first.
And this would be really catastrophic if not considered.
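To see the collision concretely, here is a small standalone sketch: not Spring's actual class, just the same generate() branching copied into a plain Java program (the class name KeyCollisionDemo is illustrative). Two unrelated cached methods taking the same parameter end up with the same key.

```java
import java.lang.reflect.Method;
import java.util.Arrays;

public class KeyCollisionDemo {

    // Same branching as the generate() method shown above:
    // the Method argument is received but never used.
    static Object generate(Object target, Method method, Object... params) {
        if (params.length == 0) {
            return "EMPTY"; // stand-in for SimpleKey.EMPTY
        }
        if (params.length == 1 && params[0] != null && !params[0].getClass().isArray()) {
            return params[0];
        }
        return Arrays.asList(params); // stand-in for new SimpleKey(params)
    }

    // Two unrelated "cached" methods taking the same parameter type
    static String getPersonNames(String department) { return "persons of " + department; }
    static String getDepartmentUnitNames(String department) { return "units of " + department; }

    // Build the key exactly the way the default generator would
    static Object keyFor(String methodName, String department) {
        try {
            Method m = KeyCollisionDemo.class.getDeclaredMethod(methodName, String.class);
            return generate(null, m, department);
        } catch (NoSuchMethodException e) {
            throw new IllegalStateException(e);
        }
    }
}
```

Both keys collapse to the bare parameter value, so whichever method is called second reads the other's cached entry.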
So what to do ?
First solution, would be to add a key to every Cacheable annotation like this:

@Cacheable(value="atlas", key="#root.targetClass + #root.methodName + #department")
This will add the class name and the method name to the generated key. And this really solves the problem.

Custom key generator

Using the mentioned solution to give a dedicated key to every method solves the problem, but adding a key to every @Cacheable annotation is tedious, and it is a bug magnet in case one forgets to include a parameter in the key.
A better solution would be to tell Spring to use our own custom key generator instead of SimpleKeyGenerator.
For this, we need to add a new class implementing the KeyGenerator interface:


package com.raissi.spring.cache;

import java.lang.reflect.Method;

import org.springframework.cache.interceptor.KeyGenerator;

public class CacheKeyGenerator implements KeyGenerator {

 @Override
 public Object generate(final Object target, final Method method,
   final Object... params) {
  StringBuilder key = new StringBuilder(method.getDeclaringClass().getName()).append(method.getName());
  if(params != null){
   for(Object obj: params){
    key.append(obj.toString());
   }
  }
  return key.toString();
 }
}
Now we are including the method and class names in the generated key. The only thing that remains is to tell Spring to use our CacheKeyGenerator instead of SimpleKeyGenerator.
In applicationContext.xml; 
<!-- Change it to reference our KeyGenerator class -->
<cache:annotation-driven key-generator="cacheKeyGenerator" />
<bean id="cacheKeyGenerator" class="com.raissi.spring.cache.CacheKeyGenerator" />
Notice that I changed the cache:annotation-driven to include a key-generator attribute.
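To check what this buys us, the generator's key-building logic can be exercised outside Spring. This is a sketch mirroring CacheKeyGenerator above (the class name CacheKeyDemo is illustrative): keys now differ per declaring class and method name.

```java
public class CacheKeyDemo {

    // Same key-building logic as CacheKeyGenerator: class name + method name + params
    static String generate(Class<?> declaringClass, String methodName, Object... params) {
        StringBuilder key = new StringBuilder(declaringClass.getName()).append(methodName);
        if (params != null) {
            for (Object obj : params) {
                key.append(obj.toString());
            }
        }
        return key.toString();
    }
}
```

Now two methods of the same class called with "DEPT01" no longer collide, since the method name is baked into the key.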

And that's it!

6. Testing your Spring applications with JUnit

The final part of this article shows you how to test your beans with JUnit and Spring Test.
First, add the following dependencies:

<!-- Tests -->
  <dependency>
   <groupId>org.springframework</groupId>
   <artifactId>spring-test</artifactId>
   <version>${spring.version}</version>
  </dependency>
  <!-- JUnit -->
  <dependency>
   <groupId>junit</groupId>
   <artifactId>junit</artifactId>
   <version>4.11</version>
   <scope>test</scope>
  </dependency>

  <!-- Mockito -->
  <dependency>
   <groupId>org.mockito</groupId>
   <artifactId>mockito-core</artifactId>
   <version>1.9.5</version>
   <scope>test</scope>
  </dependency>
Next, go to "Java resources, src/test/java" and create a new package:

There we will create a base class, AbstractContextTests, which all other test classes will extend. This class defines the Spring context config location and ensures that a WebApplicationContext is loaded for the test:

package com.raissi.spring.test;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.web.WebAppConfiguration;
import org.springframework.web.context.WebApplicationContext;

@WebAppConfiguration
@ContextConfiguration(value={"file:src/main/webapp/WEB-INF/applicationContext.xml"})
public class AbstractContextTests {

 @Autowired
 protected WebApplicationContext wac;

}
And now let's create our test classes:
package com.raissi.spring.test;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.Marker;
import org.slf4j.MarkerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

import com.raissi.PersonService;

@RunWith(SpringJUnit4ClassRunner.class)
public class PersonServiceTest extends AbstractContextTests {

 private static final Logger logger = LoggerFactory.getLogger(PersonServiceTest.class);
 private static final Marker CACHE_LOG = MarkerFactory.getMarker("AUDIT_SYS");
 
 @Autowired
 private PersonService personService;
 
 @Test
 public void test(){
  final String dept1 = "DEPT1";
  for(int i=0; i <3; i++){
   logger.info("Calling PersonService for unit names of dept {}, got {}", dept1, personService.getDepartmentUnitNames(dept1));
   logger.info("Calling PersonService for persons names of dept {}, got {}", dept1, personService.getPersonNames(dept1));
  }
  
  logger.debug(CACHE_LOG, "Adding new person {} to dept: {} with evict", "PERS04", dept1);
  personService.savePersonInDepartment("PERS04", dept1);
  logger.debug(CACHE_LOG, "Calling PersonService for persons names of dept {}, got {}", dept1, personService.getPersonNames(dept1));
  
  logger.info("Adding new person {} to dept: {} with no evict", "PERS05", dept1);
  personService.savePersonInDepartmentNoEvict("PERS05", dept1);
  logger.info("Calling PersonService for persons names of dept {}, got {}", dept1, personService.getPersonNames(dept1));
  
  
  logger.debug(CACHE_LOG, "Adding new person {} to dept: {} with evict", "PERS06", dept1);
  personService.savePersonInDepartment("PERS06", dept1);
  logger.debug(CACHE_LOG, "Calling PersonService for persons names of dept {}, got {}", dept1, personService.getPersonNames(dept1));
 }
}

And that's it: just right-click on the class and choose Run As -> JUnit Test.

I hope this will be of some help.

Friday, April 25, 2014

JSF 2.2 custom Converter for Primefaces Calendar

Recently, I've got some special requirements for the calendar component of Primefaces. The requirements are:

  • When user enters "300130" for example, the date must be automatically converted to: "30/01/2030" (French pattern: dd/MM/yyyy). Years before 30 are converted to 2000s, those after 30 are converted to 1900s. So, "150277" will be: "15/02/1977"
  • User can enter values with or without slashes so "010202" is the same as "01/02/02" and both must be transformed to "01/02/2002"
  • The dates are displayed independently to Time Zone. So when entering 01/01/2000 it must always be displayed as 01/01/2000 (see this SO question for details)
  • Another requirement that has nothing to do with converter is that the calendar icon must not be selected when clicking on "Tabulation" after filling the date input. This means the tabindex of the icon must be of value -1.
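Before diving in, the two-digit year rule from the first bullet can be captured in a couple of lines. This is a standalone sketch of just that rule, not the final converter (class and method names are illustrative):

```java
public class YearPivot {

    // Two-digit years up to 30 map to the 2000s, the rest to the 1900s,
    // matching the examples in the requirements ("30" -> 2030, "77" -> 1977).
    static int pivot(int twoDigitYear) {
        return (twoDigitYear <= 30) ? 2000 + twoDigitYear : 1900 + twoDigitYear;
    }
}
```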

First thoughts

Here, a date can match more than one pattern in the same calendar. Because of this, the pattern attribute of p:calendar won't be of any use except for formatting dates selected from the calendar dialog.
We mainly have two patterns to handle: dd/MM/yyyy and ddMMyyyy. But we also need to convert two-digit year values as specified in the first point of the requirements.
So let's begin by implementing a little algorithm for date transformations.
Please note that we will be using Joda-Time for our date manipulations.

Date processing


/**
  * A method that takes a String value and parses it, then transforms years before 30 to the 2000s.
  * Years after 30 are transformed to the 1900s.
  * If the entered value can't be parsed, or if the year is between 100 and 999,
  * this method returns a {@link javax.faces.application.FacesMessage} instead of a Date
  * @param context
  * @param component
  * @param value the String value entered by the user
  * @param pattern the pattern
  * @return
  * @throws ConverterException
  */
 private Object fixDate(FacesContext context, UIComponent component, String value, String pattern) throws ConverterException{
  DateTimeFormatter formatter = DateTimeFormat.forPattern(pattern);
  LocalDateTime ldt = null;
  try{
   ldt = formatter.parseLocalDateTime(value);
  }catch(Exception e){
            Object[] params = new Object[3];
            params[0] = value;
            params[1] = formatter.print(new DateTime( new Date()));
            params[2] = MessageFactory.getLabel(context, component);
            
            return MessageFactory.getMessage("javax.faces.converter.DateTimeConverter.DATE", FacesMessage.SEVERITY_ERROR, params);
  }
  //Get the year and see if the year value is valid, i.e. year must be < 100 or >=1900
  int yy = ldt.getYear();
  if(yy >= 100 && yy <1900){
   return MessageFactory.getMessage(
                    context, "javax.faces.converter.DateTimeConverter.DATE", value,
                    MessageFactory.getLabel(context, component));
  }
  if(yy < 100){
   int c = yy%100;
   if(c <= 30){
    yy = c + 2000;
   }else{
    yy = c + 1900;
   }
   return ldt.withYear(yy).toDate();
  }
  return ldt.toDate();

 }
The code of this method is quite simple, and the comments should help you follow it.

Calendar component

If we had only one pattern, we would simply use the pattern attribute of the p:calendar component of Primefaces. But here we have two patterns, so we will pass them through an additional custom attribute (this is possible since JSF 2.x). All my calendar components will use this custom attribute to carry every pattern our dates can take, for example:

<p:calendar converter="#{ourCustomConverter}" custompattern="dd/MM/yyyy;ddMMyyyy" pattern="dd/MM/yyyy" showOn="button" value="#{someBean.someAttr}">
<p:ajax event="change" partialSubmit="true" process="@this" update="@this">
</p:ajax>
</p:calendar>
Here we will still use the pattern attribute for the selection of dates in the calendar dialog.
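The custompattern idea boils down to trying each pattern in turn and keeping the first strict match. Here is a standalone sketch of that loop using the JDK's SimpleDateFormat instead of Joda-Time (class and method names are illustrative, not part of the converter below):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class MultiPatternParser {

    // Try each pattern in order; return the first strict parse, or null if none match.
    static Date parse(String value, String... patterns) {
        for (String pattern : patterns) {
            SimpleDateFormat format = new SimpleDateFormat(pattern);
            format.setLenient(false); // reject values that only "almost" match
            try {
                return format.parse(value);
            } catch (ParseException ignored) {
                // fall through to the next pattern
            }
        }
        return null;
    }
}
```

With the two patterns from the requirements, "01/02/2002" matches the first and "01022002" matches the second, yielding the same date.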

The custom Converter

Now we will use the custom attribute and the earlier mentioned date transforming method to process user entered dates. For this, we will use a custom converter. For more information about JSF converters please refer to this page. Mainly, we will define the two methods of the javax.faces.convert.Converter interface:
  • getAsObject: this method will convert a String value (technically it's called the "submitted value") to a model data object that will be used during the validation phase as a "local value".
  • getAsString: in some way, this method is the inverse of the previous one, it's used to get a String value from the model objects (dates in our case). The generated Strings are the ones that will be displayed to end user.
So here is our Converter:

import java.text.SimpleDateFormat;
import java.util.Date;

import javax.faces.application.FacesMessage;
import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.convert.ConverterException;
import javax.faces.convert.DateTimeConverter;

import org.joda.time.DateTime;
import org.joda.time.LocalDate;
import org.joda.time.LocalDateTime;
import org.joda.time.format.DateTimeFormat;
import org.joda.time.format.DateTimeFormatter;
import org.springframework.stereotype.Component;

import com.sun.faces.util.MessageFactory;

/**
 * Converter to be used with dates.
 * Usage: converts a date years, ex: 01/01/14 will be transformed to 01/01/2014 
 * @author Laabidi RAISSI
 *
 */
@Component("ourCustomConverter")
public class DateTimeCustomConverter extends DateTimeConverter{

 private static final Date DEFAULT_END_DATE = new LocalDate(2999, 12, 31).toDate();
 
 @Override
 public Object getAsObject(FacesContext context, UIComponent component, String value) {
  if(value == null){
   return null;
  }
  value = value.split(";")[0]; 
  String pattern = (String)component.getAttributes().get("custompattern");
  String[] patterns = pattern.split(";");
  Object ret = null;
  for(String pat: patterns){
   ret = fixDate(context, component, value, pat);
   if(ret instanceof Date){
    return ret;
   }
  }
  throw new ConverterException((FacesMessage)ret);
 }
 
 @Override
 public String getAsString(FacesContext context, UIComponent component, Object value) {
  if(value == null){
   return "";
  }
  if (context == null || component == null) {
   throw new NullPointerException();
  }

  try {
   String pattern = ((String)component.getAttributes().get("custompattern")).split(";")[0];
   SimpleDateFormat dateFormat = new SimpleDateFormat(pattern, getLocale());
   String res = dateFormat.format(value);
   String defaultStr = dateFormat.format(DEFAULT_END_DATE);
   if(defaultStr.equals(res)){
    return "";
   }
   return dateFormat.format(value);

  } catch (ConverterException e) {
   throw new ConverterException(MessageFactory.getMessage(context, STRING_ID, value, MessageFactory.getLabel(context, component)), e);
  } catch (Exception e) {
   throw new ConverterException(MessageFactory.getMessage(context, STRING_ID, value, MessageFactory.getLabel(context, component)), e);
  }
 }
 

 /**
  * A method that takes a String value and parses it, then transforms years before 30 to the 2000s.
  * Years after 30 are transformed to the 1900s.
  * If the entered value can't be parsed, or if the year is between 100 and 999,
  * this method returns a {@link javax.faces.application.FacesMessage} instead of a Date
  * @param context
  * @param component
  * @param value the String value entered by the user
  * @param pattern the pattern
  * @return
  * @throws ConverterException
  */
 private Object fixDate(FacesContext context, UIComponent component, String value, String pattern) throws ConverterException{
  DateTimeFormatter formatter = DateTimeFormat.forPattern(pattern);
  LocalDateTime ldt = null;
  try{
   ldt = formatter.parseLocalDateTime(value);
  }catch(Exception e){
            Object[] params = new Object[3];
            params[0] = value;
            params[1] = formatter.print(new DateTime( new Date()));
            params[2] = MessageFactory.getLabel(context, component);
            
            return MessageFactory.getMessage("javax.faces.converter.DateTimeConverter.DATE", FacesMessage.SEVERITY_ERROR, params);
  }
  //Get the year and see if the year value is valid, i.e. year must be < 100 or >=1900
  int yy = ldt.getYear();
  if(yy >= 100 && yy <1900){
   return MessageFactory.getMessage(
                    context, "javax.faces.converter.DateTimeConverter.DATE", value,
                    MessageFactory.getLabel(context, component));
  }
  if(yy < 100){
   int c = yy%100;
   if(c <= 30){
    yy = c + 2000;
   }else{
    yy = c + 1900;
   }
   return ldt.withYear(yy).toDate();
  }
  return ldt.toDate();

 }
}
Please notice that I am using Spring annotations in my project. You can replace @Component with @FacesConverter.
And that's all we need for our dates conversions.

Tabindex for the calendar icon

This has nothing to do with the converter subject, but since I'm talking about calendars, I thought it might be useful.
By default, the button that triggers the calendar popup in Primefaces receives focus when you tab out of the date input. This is useless in some situations (like mine, since in 90% of cases the user types the date manually rather than selecting it from the popup).

To change this behaviour, we need to set the tabindex of this icon to "-1". I used the simplest solution: overriding the Primefaces JS file named calendar.js with a slightly changed one. All you need to do is search the file for the string ".ui-datepicker-trigger:button" and add the following line:
triggerButton.attr('tabindex', -1);. So it will look something like this:

//extensions
        if(this.cfg.popup && this.cfg.showOn) {
            var triggerButton = this.jqEl.siblings('.ui-datepicker-trigger:button');
            triggerButton.html('').addClass('ui-button ui-widget ui-state-default ui-corner-all ui-button-icon-only')
                        .append('ui-button');

            var title = this.jqEl.attr('title');
            if(title) {
                triggerButton.attr('title', title);
            }
            triggerButton.attr('tabindex', -1);
            PrimeFaces.skinButton(triggerButton);
            $('#ui-datepicker-div').addClass('ui-shadow');
        }
The last thing to do is to tell JSF to include our calendar.js file last so it overrides the one in Primefaces. For this, use the f:facet tag:

<f:facet name="last">
        <h:outputScript library="js" name="calendar.js"/>
</f:facet>
And that is it!

Friday, July 19, 2013

Use Spring JavaMailSender and Freemarker to send Newsletter from your JSF2 applications

Newsletters are a very powerful way to keep in touch with your web site users. They are also widely used as a means of marketing. So how do you generate and send a newsletter from your JSF2 application?

1) Use a template engine

As stated in Wikipedia, a template engine is "a software that is designed to process web templates and content information to produce output web documents".
So the idea is very simple, like when dealing with Facelets pages, we define a template page for our Newsletter, and then use the template engine to generate a new text based on merging this template and data we pass to it.
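Freemarker aside, the merge idea itself can be illustrated with nothing but the JDK's MessageFormat. This is a toy stand-in, not a real template engine (the class name TemplateIdea is illustrative): a fixed template plus data produces the final text.

```java
import java.text.MessageFormat;

public class TemplateIdea {

    // "Template engine" in one line: substitute the model values into the template.
    static String merge(String template, Object... model) {
        return MessageFormat.format(template, model);
    }
}
```

A real engine like Freemarker adds loops, conditionals, escaping and template files on top of the same principle.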
There are many open source template engines, among them Freemarker, Velocity, StringTemplate, Thymeleaf and many others. Personally, I have worked with Velocity and Freemarker; both are very flexible and very powerful.
If you want to use Velocity for your newsletters, you can find a little example of Spring integration here. In this article we will be using Freemarker.

2) Pick a template for Newsletter

First thing to do (just as when developing a web page) is to design our Newsletter. You can ask your designer to create a static pure HTML Newsletter template. For me, I just chose this free template. And here is a screenshot of it:

It's quite simple. It contains a list of head titles (under "In this issue"). It contains also a list of latest articles: every article will contain a title, a description and eventually an image (the image can be null). The newsletter will also contain a link to unsubscribe from our mailing list. 

3) Add Maven dependencies

You need to add Freemarker, JavaMail (required by Spring mail) and if you didn't already include it, Spring Context support. So in your POM file, make sure to include these dependencies:



  <!-- Spring context support (mail and FreeMarker helpers) -->
  <dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context-support</artifactId>
    <version>${org.springframework.version}</version>
  </dependency>
  <!-- JavaMail -->
  <dependency>
    <groupId>javax.mail</groupId>
    <artifactId>mail</artifactId>
    <version>1.4.7</version>
  </dependency>
  <!-- Freemarker -->
  <dependency>
    <groupId>org.freemarker</groupId>
    <artifactId>freemarker</artifactId>
    <version>2.3.14</version>
  </dependency>

4) Data model

Now we need to prepare our data model (if you haven't already) for the newsletter. As I said, we will display a list of header titles (let's say these present flash news), a list of latest articles, and an unsubscribe link.
This is our Article class:

package com.raissi.domain.newsletter;

import java.io.Serializable;

public class Article implements Serializable{
	private static final long serialVersionUID = 2999207145055407788L;

	private String title;
	private String image;
	private String description;
	
	public Article() {
		super();
	}
	public Article(String title, String image, String description) {
		super();
		this.title = title;
		this.image = image;
		this.description = description;
	}
	
	public String getTitle() {
		return title;
	}
	public void setTitle(String title) {
		this.title = title;
	}
	public String getImage() {
		return image;
	}
	public void setImage(String image) {
		this.image = image;
	}
	public String getDescription() {
		return description;
	}
	public void setDescription(String description) {
		this.description = description;
	}		
}
For simplicity's sake, I will use just a map of (title, url) pairs to display the header titles. As for the unsubscribe link, it will point to unsbscribe-newsletter?token=encryptedUserEmail.

5) Implementation

5-a) Spring config

Spring provides a JavaMailSender utility that helps with handling mails; we will use it:

<!-- Mail sender bean: host, port and auth come from a properties file -->
<bean id="mailSender" class="org.springframework.mail.javamail.JavaMailSenderImpl">
	<property name="host" value="${mail.host}"/>
	<property name="port" value="${mail.smtp.port}"/>
	<property name="javaMailProperties">
		<props>
			<prop key="mail.smtp.auth">${mail.smtp.auth}</prop>
			<prop key="mail.smtp.starttls.enable">true</prop>
		</props>
	</property>
</bean>

Now add the Freemarker Configuration bean factory:

<!-- FreeMarker configuration: loads .ftl templates from the classpath, UTF-8 encoded -->
<bean id="freemarkerConfiguration"
	class="org.springframework.ui.freemarker.FreeMarkerConfigurationFactoryBean">
	<property name="templateLoaderPath" value="classpath:/"/>
	<property name="defaultEncoding" value="UTF-8"/>
</bean>

The comments explain each element in the above config.
Now let's create a service class that will be responsible for processing the template, generating the mail message and sending it via the defined mailSender bean. In the Spring config, add:

<bean id="mailService" class="com.raissi.service.mail.MailService">
	<property name="javaMailSender" ref="mailSender"/>
	<property name="freemarkerConfiguration" ref="freemarkerConfiguration"/>
</bean>
5-b) Service classes

And here is the MailService class:
package com.raissi.service.mail;

import java.util.Map;

import javax.mail.internet.MimeMessage;

import org.springframework.mail.javamail.JavaMailSender;
import org.springframework.mail.javamail.MimeMessageHelper;
import org.springframework.mail.javamail.MimeMessagePreparator;
import org.springframework.ui.freemarker.FreeMarkerTemplateUtils;

import freemarker.template.Configuration;

public class MailService {

	private JavaMailSender javaMailSender;
	private Configuration freemarkerConfiguration;
	
	public void sendMail(final String from, final String to, final String subject, final Map<String, Object> model, final String template){
		MimeMessagePreparator preparator = new MimeMessagePreparator() {
	         public void prepare(MimeMessage mimeMessage) throws Exception {
	            MimeMessageHelper message = new MimeMessageHelper(mimeMessage);
	            message.setFrom(from, "Raissi JSF2 sample");
       		    message.setTo(to);
       		    message.setSubject(subject);
       		    //template sample: "com/raissi/freemarker/confirm-register.ftl"
                String text = FreeMarkerTemplateUtils.processTemplateIntoString(freemarkerConfiguration.getTemplate(template,"UTF-8"), model);
	            message.setText(text, true);
	         }
	      };
		javaMailSender.send(preparator);
	}

	public void setJavaMailSender(JavaMailSender javaMailSender) {
		this.javaMailSender = javaMailSender;
	}

	public void setFreemarkerConfiguration(Configuration freemarkerConfiguration) {
		this.freemarkerConfiguration = freemarkerConfiguration;
	}	
}
The only tricky part of this class is the FreeMarkerTemplateUtils.processTemplateIntoString call. Per its Javadoc, this method will "Process the specified FreeMarker template with the given model and write the result to the given Writer." The model parameter is typically a Map that contains model names as keys and model objects as values.
This service class will be used by every class desiring to send an email with Freemarker as Template Engine in our application.
Now let's define a NewsLetterService class that will fetch data to be filled into the newsletter and then call mailService.sendMail:

package com.raissi.service.newsletter.impl;

import java.io.UnsupportedEncodingException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.inject.Inject;
import javax.inject.Named;

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

import com.raissi.domain.User;
import com.raissi.domain.newsletter.Article;
import com.raissi.service.UserService;
import com.raissi.service.mail.MailService;
import com.raissi.service.newsletter.NewsLetterService;

@Service("newsLetterService")
@Transactional
public class NewsLetterServiceImpl implements NewsLetterService{
	private static final long serialVersionUID = -6291547874161783407L;
	
	@Inject	
	private @Named("mailService")MailService mailService;
	@Inject
	private UserService userService;
	
	public void sendNewsLetter(User user){
		//Generate the Unsubscribe link, it will contain the user's email encrypted
		try {
			/*
			 * Will generate a complete url to the specified pageName and containing the tokenToBeEncrypted 
			 * as encrypted param, ex: http://mysite.com/confirm-registration?token=userNameEncrypted 
			 */
			String unsubscribeUrl = userService.generateUserToken("unsbscribe-newsletter", user.getEmail());
			//Get the site base url from the above url, and use it for images urls
			//It will be in the form: http://mysite.com/resources/freemarker
			String baseUrl = unsubscribeUrl.substring(0,unsubscribeUrl.indexOf("/unsbscribe")+1)+"resources/freemarker";
			List<Article> latestArticles = getLatestArticles();
			Map<String, String> headerTitles = getHottestNews();
			String newsSourceUrl = "http://www.richarddawkins.net/";
			String newsSourceName = "Richard Dawkins Foundation for Reason and Science";

			Map<String, Object> model = new HashMap<String, Object>();
			model.put("unsubscribeUrl", unsubscribeUrl);
			model.put("latestArticles", latestArticles);
			model.put("headerTitles", headerTitles);
			model.put("newsSourceUrl", newsSourceUrl);
			model.put("newsSourceName", newsSourceName);
			model.put("baseUrl", baseUrl);
			mailService.sendMail("raissi.java@gmail.com", user.getEmail(), "Our Newsletter", model, "com/raissi/freemarker/newsletter.ftl");
		} catch (UnsupportedEncodingException e) {
			e.printStackTrace();
		}
	}

	public Map<String, String> getHottestNews(){
		Map<String, String> news = new HashMap<String, String>();
		//Just for testing purposes, we will generate a static list of news
		news.put("Richard Dawkins to headline unique Bristol event, Sat. 24th August", "http://www.richarddawkins.net/news_articles/2013/7/19/richard-dawkins-to-headline-unique-bristol-event-sat-24th-august-2013");
		news.put("Parliament 'must pardon codebreaker Turing'", "http://www.richarddawkins.net/news_articles/2013/7/19/parliament-must-pardon-codebreaker-turing");
		news.put("Curiosity team: Massive collision may have killed Red Planet", "http://www.richarddawkins.net/news_articles/2013/7/19/curiosity-team-massive-collision-may-have-killed-red-planet");
		news.put("Tyrannosaurus rex hunted for live prey", "http://www.richarddawkins.net/news_articles/2013/7/18/tyrannosaurus-rex-hunted-for-live-prey");
		return news;
	}

	public List<Article> getLatestArticles(){
		List<Article> latestArticles = new ArrayList<Article>();
		//Just for testing purposes, we will generate a static list of Articles
		//Noam Chomsky
		String chomskyDesc = "Avram Noam Chomsky (born December 7, 1928) is an American linguist,"
				+ " philosopher, cognitive scientist, logician, political critic, and activist. "
				+ "He is an Institute Professor and Professor (Emeritus) in the Department of Linguistics & Philosophy at MIT, "
				+ "where he has worked for over 50 years. "
				+ "In addition to his work in linguistics, he has written on war, politics, and mass media, "
				+ "and is the author of over 100 books. Between 1980 and 1992, "
				+ "Chomsky was cited within the field of Arts and Humanities more often than any other living scholar, "
				+ "and eighth overall within the Arts and Humanities Citation Index during the same period."
				+ " He has been described as a prominent cultural figure, and was voted the \"world's top public intellectual\" "
				+ "in a 2005 poll.";
		String chomskyImg = "http://upload.wikimedia.org/wikipedia/commons/thumb/6/6e/Chomsky.jpg/200px-Chomsky.jpg";
		Article noamChomsky = new Article("Noam Chomsky", chomskyImg, chomskyDesc);
		latestArticles.add(noamChomsky);
		//Richard Dawkins
		String dawkinsImg = "http://upload.wikimedia.org/wikipedia/commons/thumb/a/a0/Richard_Dawkins_Cooper_Union_Shankbone.jpg/250px-Richard_Dawkins_Cooper_Union_Shankbone.jpg";
		String dawkinsDesc = "Clinton Richard Dawkins, FRS, FRSL (born 26 March 1941) is an English ethologist,"
				+ " evolutionary biologist and author. He is an emeritus fellow of New College, Oxford,"
				+ " and was the University of Oxford's Professor for Public Understanding of Science from 1995 until 2008.";
		Article richardDawkins = new Article("Richard Dawkins", dawkinsImg, dawkinsDesc);
		latestArticles.add(richardDawkins);
		//Stephen Hawking
		String hawkingDesc = "Stephen William Hawking CH, CBE, FRS, FRSA (born 8 January 1942) "
				+ "is an English theoretical physicist, cosmologist, author and Director of Research at the Centre for Theoretical Cosmology"
				+ " within the University of Cambridge. Among his significant scientific works have been a collaboration with "
				+ "Roger Penrose on gravitational singularities theorems in the framework of general relativity, "
				+ "and the theoretical prediction that black holes emit radiation, often called Hawking radiation."
				+ " Hawking was the first to set forth a cosmology explained by a union of the general theory of "
				+ "relativity and quantum mechanics. He is a vocal supporter of the many-worlds interpretation of quantum mechanics.";
		Article stephenHawking = new Article("Stephen Hawking", null, hawkingDesc);
		latestArticles.add(stephenHawking);
		//Paul Nizan
		String nizanImg = "http://upload.wikimedia.org/wikipedia/commons/thumb/3/31/Nizanpaul.jpg/220px-Nizanpaul.jpg";
		String nizanDesc = "Paul-Yves Nizan (French: [nizɑ̃]; 7 February 1905 – 23 May 1940) was a French philosopher and writer. "
				+ "He was born in Tours, Indre-et-Loire and studied in Paris where he befriended fellow student Jean-Paul Sartre at the Lycée Henri IV."
				+ " He became a member of the French Communist Party, and much of his writing reflects his political beliefs, "
				+ "although he resigned from the party upon hearing of the Molotov-Ribbentrop Pact in 1939. "
				+ "He died in the Battle of Dunkirk, fighting against the German army in World War II.";
		Article paulNizan = new Article("Paul Nizan", nizanImg, nizanDesc);
		latestArticles.add(paulNizan);
		return latestArticles;
	}
}

The code of this class is very simple; the only thing that may be unclear is the call to userService.generateUserToken. In fact, this method is just a simple utility that generates a url containing the specified token param, encrypted, and pointing to the passed page name. Here is its implementation:
public String generateUserToken(String pageName, String tokenToBeEncrypted) throws UnsupportedEncodingException{
		HttpServletRequest request = ((ServletRequestAttributes) RequestContextHolder.getRequestAttributes()).getRequest();
		String domain = "http://"+request.getServerName()+":"+request.getServerPort();
		String context = servletContext.getContextPath();
		//As in the getContextPath() docs, The path starts with a "/" character but does not end with a "/" character 
		context = domain+context;
		String encryptedToken = URLEncoder.encode(textEncryptor.encrypt(tokenToBeEncrypted),"UTF-8");
		return context+"/"+pageName+"?token="+encryptedToken;
}
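One detail worth isolating from generateUserToken is the URLEncoder.encode call: encrypted tokens often contain characters like '+' or '=' that would corrupt a query string if left raw. A small sketch (the class name TokenEncoding is illustrative):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class TokenEncoding {

    // Percent-encode a token so it survives intact as a URL query parameter.
    static String encode(String token) {
        try {
            return URLEncoder.encode(token, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException(e); // UTF-8 is always available
        }
    }
}
```

Without this step, a '+' in the token would be decoded as a space on the receiving side and the decryption would fail.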
Now everything is ready, but only the template page.

5-c) The Freemarker Template

To create the template, just copy the code of static HTML template that you chose (or your designer) and modify it by adding dynamic parts.
Let's start by the header titles. We said they are links to some news. In our Java data model, we passed them within a Map<String, String>: the title of the news is the key and the url is the value. The map is then put into the model param under the headerTitles key. To display this map in Freemarker, we use this syntax:

<#list headerTitles?keys as title>
    <a href="${headerTitles[title]}">${title}</a>
</#list>

As for the latest articles list, we passed them as a List<Article> object under the key with value: latestArticles, and here is the Freemarker code to display that list:
<#list latestArticles as article>
    <h3>${article.title}</h3>
    <#if article.image??>
        <img src="${article.image}" alt="${article.title}"/>
    </#if>
    <p>${article.description}</p>
</#list>
Of course, here I omitted the CSS code and other design related HTML code.
As for the unsubscribe link, you should be able to guess its value:

<a href="${unsubscribeUrl}">Unsubscribe</a>
since we passed the url to unsubscribe under the unsubscribeUrl key within the model param.
This is how my newsletter looks as received in my Yahoo mail account (at least the part my screen can display):

By now you should be able to send any kind of newsletter to your users.

6) Final (and very important) remarks

6-a) Inline styles

If you are using CSS styles defined in the head section of your template, most email clients will ignore them. See this link. So what can you do?
The answer is to use inline styles, for example <p style="color:#333">...</p>, instead of defining a style class that won't be recognized.
Now you may be saying: how on earth will I transform all those CSS classes into inline styles? Well, the answer is with this awesome site.

6-b) Asynchronous execution

If you use the above Java MailService class as it is, and you provide the user with a button or link that sends the newsletter (or any other kind of email) when clicked, the UI will be blocked until the email is sent. Since this may take a long time (depending on your email server and other network parameters), the process of sending emails should run asynchronously. You may think about using a Java Thread, which is completely legitimate.
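If you go the plain-Java route, the idea can be sketched with an ExecutorService; the body of the task here is a stand-in for the real MailService call, not the actual implementation:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncMailSketch {
    // A small pool dedicated to mail sending
    private static final ExecutorService MAIL_EXECUTOR = Executors.newFixedThreadPool(2);

    // Submit the slow mail-sending work to a background thread;
    // the caller (e.g. the JSF action method) returns immediately.
    public static Future<String> sendMailAsync(final String recipient) {
        return MAIL_EXECUTOR.submit(new Callable<String>() {
            public String call() throws Exception {
                Thread.sleep(50); // stand-in for the SMTP round trip
                return "sent to " + recipient;
            }
        });
    }

    public static void shutdown() {
        MAIL_EXECUTOR.shutdown();
    }
}
```

The UI thread only blocks if it explicitly calls Future.get(); for fire-and-forget newsletters you would simply ignore the returned Future.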
The good news is that Spring (with its great magic) offers the possibility of running methods asynchronously by simply adding an annotation: @Async. So in your MailService class, annotate your sendMail method with @Async.
Note that you must add the following directive (XML element) to your application context file so that Spring will recognize the @Async annotation (this assumes the task namespace, xmlns:task="http://www.springframework.org/schema/task", is declared on your beans element):

<task:annotation-driven/>
And by now everything should be just perfect.
It would be great to see your comments.

Thursday, July 18, 2013

JSF 2, Spring, Spring Security, Hibernate/JPA, Jasypt, application sample available at Github

If you have followed my previous article series about JSF 2 and integration with different frameworks, you can find the complete application source code on Github here.
Please feel free to post your comments, suggestions, and improvements.

Protect your users' passwords with Jasypt

As recommended by OWASP, when storing user credentials, you should always protect passwords in a way that prevents them from being stolen. If you save them in clear text, then once your DB is stolen, or even just accessed by your DBA, all passwords are compromised. Now what about a user who used the same password for your application and for his online bank account?
That being said, you should protect these passwords, and to do so you have one of two possibilities:
1) Encrypt the password and save it in the DB. For every login attempt, retrieve the encrypted password, decrypt it, and compare it to the given password.
2) When the user registers, generate a hash of his password and save it in the DB. For every login attempt, compare the stored hash with the hash of the password given at login.
Both methods are robust if robust algorithms are used, but I prefer the second one. Using hashes makes the user the only person who can know the real password value; the first method makes it possible for the application developer to recover it.
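Conceptually, the second approach boils down to the following sketch: hash the password with a random salt, store the salt alongside the digest, and re-hash at login. Jasypt's StandardStringDigester does exactly this for you, adding configurable iteration counts and encoding; the algorithm and salt size below are illustrative choices, not Jasypt's defaults:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.SecureRandom;
import java.util.Arrays;
import java.util.Base64;

public class DigestSketch {

    // Hash the password with a fresh random salt; store "salt:hash" together.
    static String digest(String password) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        return Base64.getEncoder().encodeToString(salt) + ":" +
               Base64.getEncoder().encodeToString(hash(salt, password));
    }

    // Re-hash the login attempt with the stored salt and compare digests.
    static boolean matches(String attempt, String stored) throws Exception {
        String[] parts = stored.split(":");
        byte[] salt = Base64.getDecoder().decode(parts[0]);
        return Arrays.equals(hash(salt, attempt), Base64.getDecoder().decode(parts[1]));
    }

    private static byte[] hash(byte[] salt, String password) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        md.update(salt);
        return md.digest(password.getBytes(StandardCharsets.UTF_8));
    }
}
```

Because the salt is random, two digests of the same password differ, which is why you must use matches() rather than comparing digest strings directly.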
In this article we will use the Jasypt library to implement both methods. Jasypt is a Java library which allows developers to add basic encryption capabilities to their projects with minimum effort, and without deep knowledge of how cryptography works.

1) Transparent password encryption with Hibernate

To implement the first approach, Jasypt offers a very simple and transparent encryption mechanism. To use it, just declare the following bean in your Spring context:

<bean id="hibernateStringEncryptor"
      class="org.jasypt.hibernate4.encryptor.HibernatePBEStringEncryptor">
    <property name="registeredName" value="hibernateStringEncryptor"/>
    <property name="password" value="simplepassword"/>
</bean>

This creates a HibernatePBEStringEncryptor object and registers it under the name "hibernateStringEncryptor". Use a much stronger password than "simplepassword" when encrypting real data.

Now, we only need to add Jasypt annotations to properties we want to be encrypted transparently, here is the User entity class:
import org.hibernate.annotations.Parameter;
import org.hibernate.annotations.TypeDef;
import org.hibernate.annotations.Type;
import org.jasypt.hibernate4.type.EncryptedStringType;
@Entity
@Table(name="user_table")
@TypeDef(
        name="encryptedString", 
        typeClass=EncryptedStringType.class, 
        parameters={@Parameter(name="encryptorRegisteredName",
                               value="hibernateStringEncryptor")}
)
public class User implements Serializable{
   //properties, here, especially the password property that we want to be encrypted:

   @Type(type="encryptedString")
   private String password;

   //Getters and Setters etc...
}
And that's all you need to do. Now you can save your user objects and check the password field in the DB: it will be encrypted.
When you wish to log a user in, just check password equality as follows:
@Transactional(readOnly=true)
public User loginUser(String login, String password) {
        User user = userDao.findUserByLoginOrEmail(login);
 if(user != null ){
  if(password.equals(user.getPassword() )){
   return user;
  }
 }
 return null;
}
As you can see, we are not performing any encryption operation in code; everything is transparent.
Please note: never use the password field in a WHERE clause (SQL, HQL, or JPQL) once it's annotated with @Type(type="encryptedString"). It is stored as an encrypted value, and you have no means of comparing a plain value against it in a query.

2) Digest (hash code) generation for passwords

Now let's see the second (and my preferred) method for storing passwords. The first thing to do is to create a StringDigester bean in Spring:

<bean id="stringDigester" class="org.jasypt.digest.StandardStringDigester"/>

Although you could just instantiate this object wherever needed, I wanted it to be a singleton for the whole application. Now inject it into your service class and use it to digest passwords when saving users:
@Inject
private @Named("stringDigester")StandardStringDigester digester;
@Transactional(propagation=Propagation.REQUIRES_NEW)
public void saveUser(User user){
        //Digest password and save it
 user.setPassword(digester.digest(user.getPassword()));
 userDao.save(user);
}

//Login method:
@Transactional(readOnly=true)
public User loginUser(String login, String password) {
 User user = userDao.findUserByLoginOrEmail(login);
 if(user != null ){
           //Call StandardStringDigester.matches to compare stored digest and provided password
  if(digester.matches(password, user.getPassword())){
   return user;
  }
 }
 return null;
}
And that's it, now you are sure that your passwords are stored in a safe way.

Wednesday, July 10, 2013

JSF 2, display PDF files stored at Amazon S3 for cloud storage

Lately I have been working on a personal project using JSF2. In that project I had to display PDF files to users on demand. I wanted to deploy the project in the cloud and I chose CloudBees. The only problem I encountered with CloudBees is that you cannot upload files via your application to their servers.
The solution was to store my files in an external storage provider. I chose Amazon Simple Storage Service (S3).
Now here is the challenge: I have my files stored on Amazon servers, and I need to remotely access these files and display them back to users via Google Docs Viewer. You may say that I could generate direct URLs to these files (as discussed here) and then simply display them. But what if you don't want them to be publicly accessible, or want them accessible only under certain conditions?
In this article we will see how to do this. But first let's begin with uploading files to Amazon S3 via our JSF2 application.

1) The JetS3t Toolkit

JetS3t is an open-source application suite for Amazon S3, the Amazon CloudFront content delivery network, and Google Storage. It provides a very simple API for interacting with these storage services. Here is the POM dependency for JetS3t:

<dependency>
    <groupId>net.java.dev.jets3t</groupId>
    <artifactId>jets3t</artifactId>
    <version>0.9.0</version>
</dependency>
2) Uploading files in JSF2

To upload files, we will use the Primefaces uploader. To use it, you should define the PrimeFaces FileUpload Filter in your web.xml descriptor:

<filter>
    <filter-name>PrimeFaces FileUpload Filter</filter-name>
    <filter-class>org.primefaces.webapp.filter.FileUploadFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>PrimeFaces FileUpload Filter</filter-name>
    <servlet-name>Faces Servlet</servlet-name>
    <dispatcher>FORWARD</dispatcher>
</filter-mapping>
Please note that we set the dispatcher to FORWARD. This is because we are using PrettyFaces (its filter dispatcher is also set to FORWARD); without this, you may encounter some problems.
You must also add dependencies on two additional libraries: commons-io and commons-fileupload. This is not mentioned in the Primefaces docs, but without them you will get many ClassNotFoundExceptions related to these libraries:

<dependency>
    <groupId>commons-fileupload</groupId>
    <artifactId>commons-fileupload</artifactId>
    <version>1.3</version>
</dependency>
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.4</version>
</dependency>
Now everything is set up. Let's begin with the implementation. The JSF upload part is very simple: you define a managed bean containing an UploadedFile property and two methods:
package com.raissi.managedbeans;
import java.io.IOException;
import java.io.Serializable;

import javax.faces.application.FacesMessage;
import javax.faces.context.FacesContext;
import javax.inject.Inject;

import org.primefaces.event.FileUploadEvent;
import org.primefaces.model.UploadedFile;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;

import com.raissi.domain.Resume;
import com.raissi.service.ResumeService;

@Component
@Scope("view")
public class HomeManagedBean implements Serializable{
 private static final long serialVersionUID = 5426154702541976181L;
 @Inject
 private ResumeService resumeService;
	private Resume resume = new Resume();
	private UploadedFile file;

	public UploadedFile getFile() {
		return file;
	}

	public void setFile(UploadedFile file) {
		this.file = file;
	}

	public void fileUploadListener(FileUploadEvent event) {
		file = event.getFile();
	}

	public void upload() {
		if (file != null) {
			try {
				resume.setDocumentName(file.getFileName());
				resumeService.persistCvContent(file.getInputstream(), resume);
				// loggedInUser is the session-scoped bean holding the current user
				// (defined in the previous article)
				loggedInUser.getUser().setResume(resume);
				FacesMessage msg = new FacesMessage("Successful", file.getFileName() + " is uploaded.");
				FacesContext.getCurrentInstance().addMessage(null, msg);
			} catch (IOException e) {
				FacesMessage msg = new FacesMessage("File ", file.getFileName() + " couldn't be uploaded. Please contact admins");
				FacesContext.getCurrentInstance().addMessage(null, msg);
				e.printStackTrace();
			}
		}
	}
}
And here is the XHTML code (simplified to the upload essentials; the real page also contained layout and styling markup):

<h:form enctype="multipart/form-data">
    <p:fileUpload fileUploadListener="#{homeManagedBean.fileUploadListener}"
                  mode="advanced" allowTypes="/(\.|\/)(pdf)$/"/>
    <p:commandButton value="Save" action="#{homeManagedBean.upload}"/>
</h:form>
The Resume class referenced in the managed bean is just a domain class that we defined in the last article. It contains data related to the user's resume and is persisted in the DB. The key class here is the ResumeService bean, which is responsible for Resume persistence both in the DB (via ResumeDao) and in the remote Amazon S3 store. Before we go through document persistence in Amazon S3, please make sure you have your Amazon S3 credentials: AWSAccessKeyId (access key ID) and AWSSecretKey (secret key), since we will use them to store/retrieve data from S3.

3) Persisting files to Amazon S3 

Now let's examine the ResumeService class:

package com.raissi.service.impl;

import java.io.IOException;
import java.io.InputStream;
import java.io.Serializable;
import java.security.NoSuchAlgorithmException;

import javax.annotation.PostConstruct;
import javax.inject.Inject;
import javax.inject.Named;

import org.apache.commons.io.IOUtils;
import org.jets3t.service.S3Service;
import org.jets3t.service.S3ServiceException;
import org.jets3t.service.ServiceException;
import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.model.S3Bucket;
import org.jets3t.service.model.S3Object;
import org.jets3t.service.security.AWSCredentials;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.transaction.annotation.Transactional;

import com.raissi.dao.ResumeDao;
import com.raissi.domain.Resume;
import com.raissi.service.ResumeService;

@Named("resumeService")
@Transactional
public class ResumeServiceImpl implements ResumeService, Serializable{
 private static final long serialVersionUID = 1L;
 
 @Inject
 private ResumeDao resumeDao;
 @Value("${s3.accessKeyId}")
 private String amazonAccessKeyId;
 
 @Value("${s3.secretKey}")
 private String amazonSecretKey;
 
 @Value("${s3.bucketName}")
 private String bucketName;
 
 private S3Service s3Service;
 // To store data in S3 you must first create a bucket, a container for objects.
 private S3Bucket bucket;
 @PostConstruct
 public void init(){
  try {
   //amazon S3 storage credentials:
   AWSCredentials awsCredentials = 
       new AWSCredentials(amazonAccessKeyId, amazonSecretKey);
   //To communicate with S3, create a class that implements an S3Service. 
   //We will use the REST/HTTP implementation based on HttpClient, 
   //as this is the most robust implementation provided with JetS3t.
   s3Service = new RestS3Service(awsCredentials);
   
   bucket = s3Service.getBucket(bucketName);
   if(bucket == null){
    bucket = s3Service.createBucket(bucketName);
   }
  } catch (S3ServiceException e) {
   // TODO Auto-generated catch block
   e.printStackTrace();
  }
 }

        @Override
 public void persistCvContent(InputStream content, Resume resume) {
  long currentTime = System.currentTimeMillis();
  String extension = resume.getDocumentName().substring(resume.getDocumentName().lastIndexOf("."));
  String fileName = currentTime+extension;
  
  try {
   byte[] contentArray = IOUtils.toByteArray(content);
   S3Object cvS3Object = new S3Object(fileName, contentArray);
   cvS3Object.setContentLength(contentArray.length);
   cvS3Object.setContentType("application/pdf");
   s3Service.putObject(bucket, cvS3Object);
   resume.setContentUrl(bucketName+"/"+fileName);
  } catch (S3ServiceException e1) {
   // TODO Auto-generated catch block
   e1.printStackTrace();
  } catch (NoSuchAlgorithmException e) {
   // TODO Auto-generated catch block
   e.printStackTrace();
  } catch (IOException e) {
   // TODO Auto-generated catch block
   e.printStackTrace();
  }
 }
}
A bucket in Amazon S3 is a file container in which every uploaded file is stored. A bucket name must be unique across all S3 users, buckets cannot be nested, and each bucket can contain an unlimited number of files. I decided to make the bucket name configurable, together with the accessKeyId and secretKey; that's why I am injecting their values into ResumeService, so the application admin can choose whatever he wants. (Remember the property-placeholder defined in the Spring application context in the last article: these String values should be defined in the file referenced by the properties holder.)
Also notice that we are saving the remote file name and the bucket name in the persistent entity Resume. This way, we are able to use different bucket names in the future.
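For completeness, the three @Value placeholders used in ResumeServiceImpl expect entries like these in the properties file loaded by the property-placeholder (the values shown are of course placeholders, not real credentials):

```properties
# Amazon S3 settings referenced by @Value in ResumeServiceImpl
s3.accessKeyId=YOUR_ACCESS_KEY_ID
s3.secretKey=YOUR_SECRET_KEY
s3.bucketName=your-unique-bucket-name
```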

4) Displaying PDF files in JSF2

To display PDF files, we can simply use the p:media component from Primefaces, something like:

<p:media value="/resources/sample.pdf" width="100%" height="300px"/>

But this works for static and public files (accessible by simple URLs). For dynamic files which need some server-side logic before rendering, we can use a custom servlet together with the same Primefaces component.
Use case: assume we have a datatable displaying a list of users (the User object from the last article). Each row (User) has a button to view the CV; when clicked, the button opens a popup dialog displaying the PDF file. Here is the XHTML code (simplified; the original contained more columns and styling):

<h:form id="usersForm">
    <p:dataTable value="#{adminHomeManagedBean.users}" var="user">
        <f:facet name="header">Registered users</f:facet>
        <p:column headerText="Login">#{user.login}</p:column>
        <p:column>
            <p:commandButton value="View CV" update=":cvForm"
                             actionListener="#{adminHomeManagedBean.generateUserCV(user)}"
                             oncomplete="cvDialog.show()"/>
        </p:column>
    </p:dataTable>
</h:form>
<h:form id="cvForm">
    <p:dialog widgetVar="cvDialog" header="User CV" modal="true" width="700">
        <p:media value="/file/cv?id=#{adminHomeManagedBean.selectedUser.userId}"
                 width="100%" height="300px"/>
        <p:ajax event="close" listener="#{adminHomeManagedBean.removeUserCVFromSession}"/>
    </p:dialog>
</h:form>
As you can see, the command button used to display the dialog has an action listener that calls a server-side method, generateUserCV(User user); here is its code:
public void generateUserCV(User user){
  setSelectedUser(user);
  if(user != null){
   InputStream file = resumeService.getCvByUser(user.getUserId());
   if(file != null){
    try {
     byte[] bytes = IOUtils.toByteArray(file);
      Map<String, Object> session = FacesContext.getCurrentInstance().getExternalContext().getSessionMap();
     //We are using userId when storing cv content to be possible to display multiple files 
     session.put(ResumeService.ATTR_RESUME+user.getUserId(), bytes);
    } catch (IOException e) {
     // TODO Auto-generated catch block
     e.printStackTrace();
    }
    
   }
  }
}
The generateUserCV method calls the resumeService to get the user's CV file as a java.io.InputStream. Then we use org.apache.commons.io.IOUtils#toByteArray(InputStream input) to convert the file content to a byte array. After that, we store the array in the session map under a constant String (ResumeService.ATTR_RESUME) of our choice, concatenated with the user id; it will be used by the custom servlet later. Now here is the code of resumeService.getCvByUser:
public InputStream getCvByUser(Long userId){
  Resume resume = resumeDao.getResumeByUser(userId);
  String fullS3Name = resume.getContentUrl();
                //Remember, we set the bucket name and file name in the contentUrl property of Resume
  String bucketNameForResume = fullS3Name.substring(0, fullS3Name.indexOf("/"));
  String fileName = fullS3Name.substring(fullS3Name.indexOf("/")+1);
  try {

   S3Object cvS3Object = s3Service.getObject(bucketNameForResume, fileName);
   if(cvS3Object != null){
    InputStream stream = cvS3Object.getDataInputStream();    
    return stream;
   }
  } catch (S3ServiceException e) {
   // TODO Auto-generated catch block
   e.printStackTrace();
  } catch (ServiceException e) {
   // TODO Auto-generated catch block
   e.printStackTrace();
  }
  
  return null;
 }
In the dialog to be shown, you may have noticed that we are using p:media with a stream value: /file/cv?id=#{adminHomeManagedBean.selectedUser.userId}. Here we are referring to a GET call to a servlet with a param named "id". And here is the servlet implementation:
package com.raissi.servlet;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.raissi.service.ResumeService;

@WebServlet("/file/cv")
public class ResumeServlet  extends HttpServlet {
 private static final long serialVersionUID = -221600603615879137L;

 @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
  String userId = request.getParameter("id");
  if(userId != null && !userId.equals("")){
         byte[] content = (byte[]) request.getSession().getAttribute(ResumeService.ATTR_RESUME+userId);
            if(content != null){
          response.setContentType("application/pdf");
          response.setContentLength(content.length);
          response.getOutputStream().write(content);
            }
  }
    }

}
The code is straightforward. First, we fetch the user id (the user whose CV we are displaying) from the request parameters. Then we access the HTTP session to retrieve the stored content and write it to the response output. Finally, we need to clean the session when the dialog is closed; otherwise, the session will be encumbered with content. That's why I added a listener called on the dialog close event, <p:ajax event="close" listener="#{adminHomeManagedBean.removeUserCVFromSession}"/>. And here is the listener code:
public void removeUserCVFromSession(CloseEvent event){
  if(selectedUser != null){
   Map<String, Object> session = FacesContext.getCurrentInstance().getExternalContext().getSessionMap();
   session.remove(ResumeService.ATTR_RESUME+selectedUser.getUserId());
  }
 }
And that's all you need.