comic

Comics are a medium used to express ideas with images, often combined with text or other visual information. Read more at Wikipedia.

  • An example-driven presentation on how we leverage custom Docker containers (maven, npm, oc client) to handle build dependencies on local workstations, provide clean Jenkins slaves, and run tests inside orchestrated deployments or OpenShift projects. By Matthias Bertschy from SICPA.

    Slides

    Recorded by me at the Docker Meetup group, 5 Oct. 2016

  • Resources such as JavaScript and CSS files can be compressed before being sent to the browser, improving network efficiency and application load time in certain cases. If you are not using Apache with mod_deflate or nginx in front of your web application, you may need to implement resource compression yourself….

    Wait! Don’t start writing your own filter to compress files like CSS, HTML, txt or JavaScript: it is way more difficult than you think to properly handle HTTP response headers, MIME types and caching. In one sentence, don’t reinvent the wheel: use Ehcache, for example.

    Ehcache is an open source, standards-based cache used to boost performance, offload the database and simplify scalability. Ehcache is robust, proven and full-featured, and this has made it the most widely-used Java-based cache. It can scale from in-process with one or more nodes through to a mixed in-process/out-of-process configuration with terabyte-sized caches. For applications needing a coherent distributed cache, Ehcache uses the open source Terracotta Server Array.

    In the pom.xml of your project, add the following dependency on ehcache-web:

    <dependency>
        <groupId>net.sf.ehcache</groupId>
        <artifactId>ehcache-web</artifactId>
        <version>2.0.4</version>
    </dependency>

    In your web.xml, add a filter and configure it properly:

    <filter>
     <filter-name>CompressionFilter</filter-name>
     <filter-class>net.sf.ehcache.constructs.web.filter.GzipFilter</filter-class>
    </filter>
    <filter-mapping>
     <filter-name>CompressionFilter</filter-name>
     <url-pattern>*.css</url-pattern>
    </filter-mapping>
    <filter-mapping>
     <filter-name>CompressionFilter</filter-name>
     <url-pattern>*.html</url-pattern>
    </filter-mapping>
    <filter-mapping>
     <filter-name>CompressionFilter</filter-name>
     <url-pattern>*.js</url-pattern>
    </filter-mapping>
    <filter-mapping>
     <filter-name>CompressionFilter</filter-name>
     <url-pattern>/*</url-pattern>
    </filter-mapping>

    Read more at EhCache Web Caching page.

    As a bonus, I also provide below the configuration for the famous challenger HTTP server nginx:

     ##
     # Gzip Settings
     ##
     gzip  on;
     gzip_http_version 1.1;
     gzip_vary on;
     gzip_comp_level 6;
     gzip_proxied any;
     gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript application/javascript text/x-js;
     gzip_buffers 16 8k;
     gzip_disable "MSIE [1-6]\.(?!.*SV1)";


    Or for the number one HTTP server, Apache, using mod_deflate in /etc/apache2/conf.d/deflate.conf:

    <Location />
    # Insert filter
    SetOutputFilter DEFLATE
    
    AddOutputFilterByType DEFLATE text/plain
    AddOutputFilterByType DEFLATE text/xml
    AddOutputFilterByType DEFLATE application/xhtml+xml
    AddOutputFilterByType DEFLATE text/css
    AddOutputFilterByType DEFLATE application/xml
    AddOutputFilterByType DEFLATE image/svg+xml
    AddOutputFilterByType DEFLATE application/rss+xml
    AddOutputFilterByType DEFLATE application/atom+xml
    AddOutputFilterByType DEFLATE application/x-javascript
    AddOutputFilterByType DEFLATE text/html
    
    # Netscape 4.x has some problems...
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    
    # Netscape 4.06-4.08 have some more problems
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    
    # MSIE masquerades as Netscape, but it is fine
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
    # Don't compress images
    SetEnvIfNoCase Request_URI \
    \.(?:gif|jpe?g|png)$ no-gzip dont-vary
    
    # Make sure proxies don't deliver the wrong content
    Header append Vary User-Agent env=!dont-vary
    </Location>


  • Apple TV  is a set-top box manufactured by Apple. The Apple TV name and start of online sales was announced by Steve Jobs during the keynote speech at the 2007 Macworld Expo in San Francisco on January 9, 2007. The Apple TV is enabled to stream digital content from any computer running Mac OS X or Microsoft Windows running iTunes onto an enhanced-definition or high-definition widescreen television.
    Official site: Apple TV
     

  • apache_maven

    This issue has turned me upside down for a long time. In fact, in the official Google Group http://groups.google.de/group/maven-for-php/ I was not the only one to have this issue.

    I did try the following, and it is always good to check these first:

    • Checking the PHP version,
    • Starting Maven with -X to get more debug information,
    • Testing it in Eclipse + M2Eclipse on Windows: it was working there,
    • Comparing the calculated PHP include path on Windows and Linux: they were identical in this case

    Only my Linux box was not working… (http://teamcity.waltercedric.com)

    After that I materialized the Eclipse project of maven-php-plugin and even built a custom version that I deployed without any effort to my Artifactory (http://maven.waltercedric.com).

    And what is the solution?

    It was my server configuration and paranoia :-)

    Open your php.ini, ideally the right one; don’t put your server at risk. You may have many under Linux, especially if you use Plesk or cPanel:

    • cli at /etc/php5/cli/php.ini
    • apache2, /etc/php5/apache2/php.ini
    • fastcgi at /etc/php5/fastcgi/php.ini

    Most of the time the relevant one is

    /etc/php5/cli/php.ini

    and add the directory where your build server makes its checkout…

    ; open_basedir, if set, limits all file operations to the defined directory
    ; and below.  This directive makes most sense if used in a per-directory
    ; or per-virtualhost web server configuration file. This directive is
    ; *NOT* affected by whether Safe Mode is turned On or Off.
    open_basedir = /www/vhosts:/tmp:/xxxx/yyyy/

    The next step is to get Joomla! 1.6 and all its PHPUnit tests running along with Selenium. I may also need to patch Maven for PHP to better support test reporting like Surefire.

  • apache_maven

    Packt Publishing has offered me the chance to review two of their new books (thanks to them). I should receive free samples for review at the beginning of next week. Since these are two of my favorite subjects (Maven and Joomla!®), I think they may also interest you. A review will follow in a few days.

    Apache Maven 2: Effective Implementation

    Build and Manage Applications with Maven, Continuum, and Archiva

    • Install Apache Maven and follow the sample application to build up your project as quickly as possible
    • Test your applications to ensure maximum stability using Maven's inbuilt tools
    • Use Maven's report and checking tools to ensure the health of your projects
    • Explore Apache Continuum which will help you to ensure the health of your source code
    • Improve your team builds with the powerful combination of Maven, Archiva and Continuum
    • Install and run the repository manager Apache Archiva


    Joomla!® 1.5 Development Cookbook 

    • Make your extensions extensible, add extensions points to allow third parties to customize your extension
    • Create international extensions by enabling multilingual capabilities
    • Build more than just HTML pages - create PDF documents, Atom Feeds, and more!
    • Improve the user experience by adding Ajax
    • Create Atom and RSS feeds to keep users up-to-date
    • Utilize the power of Subversion to maintain your source code
    • Execute database queries and handle returned data in order to access and modify your data
    • Dynamically extend your database tables using JParameter to make your extensions more flexible
    • Keep your gremlins at bay by handling errors the Joomla! way
    • Work with the file system, interrogate existing files and folders and store data in the file system
    • Take control of your workflows by using www.JoomlaCode.org to manage your Joomla! projects
  • eclipse

    A very little trick that allows you to quickly run any operation involving a DOS command on an Eclipse project. Go to the external tools launcher and create a new configuration.

    This trick may be useful for running your set of Maven commands without any dependency on M2Eclipse.

    Location: ${env_var:COMSPEC}
    Working Directory: ${project_loc}

    ${env_var}

    Returns the value of an environment variable. An environment variable name must be specified as an argument.

  • apache_maven

    We had serious performance problems with Maven in our environment. It seems to be a recurrent problem for Maven... anyway, I came through with the following changes... the 2.0.9.db1 Maven 2 patch really makes Maven fly!

    General settings to speed up Maven:

    • More memory for the Maven process: change the Eclipse launcher to set MAVEN_OPTS like this:
      -DMAVEN_OPTS="-Xms64m -Xmx128m"
    • Use the latest version of Maven, but be careful of regressions! The latest as of today is 2.0.9
    • There is a patch available for Maven 2.0.9 which speeds up builds by 40%. It is simply day and
      night! Try it, you'll love it! Basically Don Brown altered Maven 2.0.9 to...

    General settings to speed up Eclipse:

    1. Use javaw.exe to start Eclipse and not java.exe (which is more for console-based programs with a lot of feedback),
       while javaw.exe is more for graphical environments.
    2. Aggressive JIT and dual core processors should use:
       -XX:-UseParallelGC -XX:+AggressiveOpts -XX:-UseConcMarkSweepGC -XX:+UseFastAccessorMethods
    3. Give more memory, MORE MEMORY for Eclipse; on a 4GB machine, these are my settings:
      -Xms768m -Xmx1024m -XX:MaxPermSize=256m
    4. Reduce the number of warnings reported by Eclipse per compilation unit (class); the default is 100, reduce it to 10.
      It helps nobody to see a workspace slowing down because of too much warning logging.
      Remove the warnings instead ;-)
    5. The SVN console with Subversive is too verbose by default; go to Eclipse preferences - Team - SVN - Console.
      Logging SVN errors should be enough.
    6. Use a defragmenter! NTFS fragments fast with so many small files in the workspace; every 2 weeks is a good practice.
    7. I am using Java 1.6u10 (BETA!) and have experienced no crash so far, though being on the edge can be
      costly in time. Maven forking should benefit from the reduced Java kernel size and bootstrap time.
  • When working with many feature/release/bugfix/hotfix branches, it is a bad idea to keep changing the pom version, as this will create merge conflicts in pull requests. This plugin allows you to keep the same pom version in ALL branches for all your projects, for example MASTER-SNAPSHOT; the version will be derived from the branch name automagically :-)

    You may want to read these 2 short articles first:

    git-branch-renamer-maven-plugin allows you to keep the same pom version in ALL branches for all your projects, for example MASTER-SNAPSHOT, and never change it again.

    The project version will be derived from the branch name automatically when running in your continuous integration server.

    For a branch named feature/xxxx, the generated version is:

    • <version>xxxx-SNAPSHOT</version> (default)
    • <version>xxxx</version> (release = true)
    • <version>0-xxxx-SNAPSHOT</version> (forceNumericalVersion = true)
    • <version>feature-xxxx-SNAPSHOT</version> (filterOutBranchQualifier = false)

    The project is hosted at Github https://github.com/cedricwalter/git-branch-renamer-maven-plugin 
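
    Below is a minimal sketch of how such a plugin could be wired into a parent pom. The groupId, version and parameter names are assumptions based on the options listed above; check the project README on GitHub for the exact coordinates and defaults.

    <build>
        <plugins>
            <plugin>
                <!-- hypothetical coordinates and version, verify against the GitHub project -->
                <groupId>com.cedricwalter</groupId>
                <artifactId>git-branch-renamer-maven-plugin</artifactId>
                <version>1.0</version>
                <configuration>
                    <!-- options taken from the list above -->
                    <release>false</release>
                    <forceNumericalVersion>false</forceNumericalVersion>
                    <filterOutBranchQualifier>true</filterOutBranchQualifier>
                </configuration>
            </plugin>
        </plugins>
    </build>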

  • From http://www.bunniestudios.com/wordpress/?p=74, by the man who broke the first Xbox.

    At any rate, some very interesting things are afoot. Much of it stems from the discovery of an all-media bootable kiosk demo disk. Many hackers will instantly recognize the value of this, but it’s still interesting to reflect on the significance of this find. Like the original Xbox, the Xbox360 uses a media flag on its executables.

    The media flag tells the OS what type of media it should be on; typically, games are released with the flag set to Microsoft’s proprietary secure Xbox DVD format (which is in itself not that secure…). Significantly, only the executable is signed for a game; the data sections typically are not signed (presumably for performance reasons). Thus, one has the ability to fuzz the executable by corrupting the data sections, potentially invoking a buffer overrun or some other unintentional behavior–if one could effectively modify the data sections. Remember that this is normally not possible, since modifying the data segment requires making a copy to a writeable media, and this contradicts the signed media flag.

    Thus, the run-anywhere demo disk now enables software hackers to create and test the interaction of signed executables with modified game data using no tool other than a DVD-RW drive (and an Xbox360 console, still considerably rare and difficult to obtain in the US). Some of the more interesting modifiable data regions include Shockwave Flash movies, and the pixel shaders executed by the GPU (more info can be found on the xboxhacker.net website). Of particular interest is the MEMEXPORT shader command in the 360, which could enable people to dump physical memory to the screen (where it can be digitized or extracted with a sniffer upstream of the ANA chip), or to some other peripheral function. Presuming plaintext kernel code can be extracted this way, it bootstraps further efforts in vulnerability analysis of the code running in the Xbox…and so forth. Of course, its quite possible that this hole is plugged, since Microsoft’s NGSCB spec calls for the Northbridge to limit DMA access from the graphics card to main memory. Furthermore, buffer overrun exploits have questionable applicability since each process runs as its own virtual machine and rumors has it that the no-execute bit is used on heap space. Still, I’m very surprised that such a media was even released into the wild by Microsoft…their own worst enemy is their own haste to get to the market and carelessness; security is for naught without consideration of human factors. Very exciting! Perhaps the Xbox360 will be opened without the need for significant hardware hacking.
  • Hackers have modified an estimated 150,000 of the 9 million Xboxes Microsoft has sold worldwide to turn them into PCs that would normally cost $800 or more. What you need:
    1. A new Xbox, with 733-megahertz processor, custom graphics chip, 8-gigabyte hard drive: $149
    2. A keyboard and mouse, with adapters: $35
    3. A modified start-up chip: $21
    4. 120-gigabyte hard drive: $120
    5. Linux and some free software: $0
    Total: $325
    Source: USA TODAY research
  • I will try to keep a history of all my previous machines on this page... yes, it is just nerd/geek behavior...

    geek-inside

    In 15 years...

    • Costs in euros (in green) are similar in amount; you just get more size, disk and speed for your buck
    • 1208 times more IPS and my PC still sometimes hangs (in order: Vista, XP, Linux, I am looking at you) :-(
    • 81 times more disk space!
    • Power consumption 360 watts today :-(
    • PCs are still slower than what I expect...

     

    Instructions per second (IPS) is a measure of a computer's processor speed. Many reported IPS
    values have represented "peak" execution rates on artificial instruction sequences with few branches,
    whereas realistic workloads consist of a mix of instructions and even applications, some of which take
    longer to execute than others. The performance of the memory hierarchy also greatly affects processor
    performance, an issue barely considered in MIPS calculations. Because of these problems, researchers
    created standardized tests such as SPECint to (maybe) measure the real effective performance in
    commonly used applications, and raw IPS has fallen into disuse. [WikiPedia]

  • apache_maven

    I was fighting today against the Maven maven-release-plugin, solving complicated errors in a row. As I am convinced I made all possible errors, I think it is worth compiling my findings here to help others :-)

    Maven Release Plugin

    This plugin is used to release a project with Maven, saving a lot of repetitive, manual work. Releasing a project is made in two steps: prepare and perform.

    My approach to speed things up is always to define a small project (in a sandbox SVN root) that compiles and runs in 10 seconds, to make some tests before trying to make it run on our bigger Innoveo Skye(tm) product (35 modules).

    I always have 2 projects prepared:

    • One TestSimpleProject: one Maven project with no code
    • One TestComplexProject: one maven project and 2 Maven sub modules

    For the reader who cannot wait, here is the working command line from TeamCity, to be put in the Build Runner Goals:

    release:clean release:prepare release:perform -Dusername=xxxxxxx -Dpassword=yyyyyy

     

    Latest SVN client is  recommended

    You need the latest SVN command line client on all TeamCity agents; at least avoid the SVN command line clients newer than 1.5.0 in the 1.5.x series, which don’t work (1.5.0 would have). We were of course using 1.5.1 on all our servers (Murphy’s law).

    Use at least a Subversion SVN client (1.6.6 as for today).

    If you don’t have any SVN command line client installed on your TeamCity agents, you’ll end up with this easy to understand error:

    [INFO] Unable to check for local modifications
    [11:34:40]:
    Provider message:
    [11:34:40]:
    The svn command failed.
    [11:34:40]:
    Command output:
    [11:34:40]:
    /bin/sh: svn: command not found
    [11:34:40]:
    [INFO] Trace
    [11:34:40]:
    org.apache.maven.BuildFailureException: Unable to check for local modifications
    [11:34:40]:
    Provider message:
    [11:34:40]:
    The svn command failed.
    [11:34:40]:
    Command output:
    [11:34:40]:
    /bin/sh: svn: command not found
    [11:34:40]:
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:715)
    [11:34:40]:
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:569)
    [11:34:40]: at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:539)

    The maven-release-plugin requires an SVN client to be installed.

    Update maven-release-plugin to the latest

    You need to update the maven-release-plugin away from 2.0-beta-9 to 2.0 to solve the issue with multi-module releases. Luckily for me, 2.0 has been available since 10 February 2010. Older versions were working for simple Maven projects (a project with no Maven modules) but not with multi-module projects!
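
    If your parent pom does not pin the plugin version yet, here is a small sketch of what that looks like, using the 2.0 release mentioned above:

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-release-plugin</artifactId>
                <version>2.0</version>
            </plugin>
        </plugins>
    </build>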

    With 2.0-beta-9 I was getting this error in multi-module projects:

    [18:41:46]:[ERROR] BUILD FAILURE
    [18:41:46]: [INFO] ------------------------------------------------------------------------
    [18:41:46]: [INFO] Unable to tag SCM
    [18:41:46]: Provider message:
    [18:41:46]: The svn tag command failed.
    [18:41:46]: Command output:
    [18:41:46]: svn: Commit failed (details follow):
    [18:41:46]: svn: File '/svn/xxxxx/skye/tags/skye-2.1.0.M8/skye-admin/pom.xml' already exists
    [18:41:46]: [INFO] Trace
    [18:41:46]: org.apache.maven.BuildFailureException: Unable to tag SCM
    [18:41:46]: Provider message:
    [18:41:46]:
    The svn tag command failed.
    [18:41:46]: Command output:
    [18:41:46]: svn: Commit failed (details follow):
    [18:41:46]: svn: File '/svn/xxxxx/skye/tags/skye-2.1.0.M8/skye-admin/pom.xml' already exists

    Invalid certificate handling

    This is sometimes an issue if you craft certificates yourself. You need to import the certificate on each TeamCity build agent by running

    # svn co https://svn.xxxxxx.com/svn/yyyyyy/skye

    at least once and permanently accepting the certificate (break the command afterward). Do this with the same UNIX user under which the agent runs, or you’ll always get this error:

    [11:52:11]:[ERROR] BUILD FAILURE
    [11:52:11]:
    [INFO] ------------------------------------------------------------------------
    [11:52:11]:
    [INFO] Unable to checkout from SCM
    [11:52:11]:
    Provider message:
    [11:52:11]:
    The svn command failed.
    [11:52:11]:
    Command output:
    [11:52:11]:
    svn: OPTIONS of 'https://xxxxx.: Server certificate verification failed: certificate issued
                      for a different hostname, issuer is not trusted (xxxxxxxxx)
    [11:52:11]:
    [INFO] Trace
    [11:52:11]:
    org.apache.maven.BuildFailureException: Unable to checkout from SCM
    [11:52:11]:
    Provider message:
    [11:52:11]:
    The svn command failed.
    [11:52:11]: Command output:

    Maven 2.2.1 wrongly calculates the SCM commit URL

    You cannot use in the <scm> tag this kind of URL, https://user:password@host, like in the example below:

    <scm>
    <connection>scm:svn:https://username:password@svn.xxxxx.com/svn/yyyyy/skye/trunk/skye</connection>
    <developerConnection>scm:svn:https://username:password@svn.xxxxx.com/svn/yyyyy/skye/trunk/skye</developerConnection>
    <url>scm:svn:https://username:password@svn.xxxxx.com/svn/yyyyy/skye/trunk/skye</url>
    </scm>

    Even if the documentation states otherwise, the maven-release-plugin goes “crazy” and wrongly concatenates the tagging URL:

    [17:32:47]: [INFO] Working directory: /home/agent/buildagent/work/3d299c4b925af39b/TestRelease
    [17:32:47]: [INFO] ------------------------------------------------------------------------
    [17:32:47]:
    [ERROR] BUILD FAILURE
    [17:32:47]:
    [INFO] ------------------------------------------------------------------------
    [17:32:47]:
    [INFO] Unable to tag SCM
    [17:32:47]:
    Provider message:
    [17:32:47]:
    The svn tag command failed.
    [17:32:47]:
    Command output:
    [17:32:47]:
    svn: Source and dest appear not to be in the same repository
                         (src: 'https://svn.xxxxxx.com/svn/xxxxxxx/Sandbox/trunk';
                         dst: 'https://xxxxxxx:password@svn.xxxxxx.com/svn/xxxxxx/Sandbox/tags/TestRelease-0.0.11')
    [17:32:47]: [INFO] ------------------------------------------------------------------------

    I found a workaround by passing the credentials as properties in the TeamCity build, in the list of Maven goals:

    -Dusername=xxxx -Dpassword=yyyy

    Beware of invalid SCM URL

    SCM (Software Configuration Management, also called Source Code/Control Management or, succinctly, version control) is an integral part of any healthy project. If your Maven project uses an SCM system (it does, doesn't it?) then here is where you would place that information into the POM.

    A lot of examples are floating around on the internet with <scm> values that look like this:

    <scm>
    <connection>scm:svn:https://svn.xxxxx.com/svn/yyyyy/skye/trunk</connection>
    <developerConnection>scm:svn:https://svn.xxxxx.com/svn/yyyyy/skye/trunk/</developerConnection>
    <url>scm:svn:https://svn.xxxx.com/svn/yyyyy/skye/trunk/skye</url>
    </scm>

    With the above, you’ll end up tagging your whole trunk under a new tag in https://svn.xxxxx.com/svn/yyyyy/skye/tags/skye-2.1.0

    What no one is saying is that you should rather have this: end your SCM connection with the project you would like to tag.

    <scm>
    <connection>scm:svn:https://svn.xxxxx.com/svn/yyyyy/skye/trunk/skye</connection>
    <developerConnection>scm:svn:https://svn.xxxx.com/svn/yyyyy/skye/trunk/skye</developerConnection>
    <url>scm:svn:https://svn.xxxx.com/svn/yyyyy/skye/trunk/skye</url>
    </scm>

    Failure to deploy the newly built artifact

    This one is also irritating, because a build running in TeamCity with the goal deploy works perfectly, while the same build with release:prepare release:perform fails miserably at the end with

    [INFO] [ERROR] BUILD ERROR
    [19:26:08]: [INFO] [INFO] ------------------------------------------------------------------------
    [19:26:08]: [INFO] [INFO] Error deploying artifact: Failed to transfer file: http://artifactory.xxxxxx.com:/libs-releases-local/…
                                           . Return code is: 401[19:26:08]:
    [ERROR] BUILD ERROR
    [19:26:08]:
    [INFO] ------------------------------------------------------------------------
    [19:26:08]:
    [INFO] Maven execution failed, exit code: '1'
    [19:26:08]: [INFO] ------------------------------------------------------------------------
    [19:26:08]:
    [INFO] Trace
    [19:26:08]:
    org.apache.maven.lifecycle.LifecycleExecutionException: Maven execution failed, exit code: '1'
    [19:26:08]:
    at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:719)
    [19:26:08]: at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:569)

    I was not able to find a real workaround for this one: the same build runs fine without the maven-release-plugin and deploys correctly to Artifactory! But I managed to get around it by using the latest version, 3.0-alpha-7, in place of the stable Maven 2.2.1!

    I hope this post will help some of you.

  • No Xbox you can mod, but still wanting to use this beautiful piece of software, XBMC?

    You may take a look at Media Portal:
    "Media Portal turns your PC in a very advanced Multi MediaCenter / HTPC.

    It allows you to listen to your favorite music & radio, watch your video's and DVD's, view, schedule and record live TV and much more. You get Media Portal for free/nothing/nada/noppes and best of all it is opensource. This means anyone can help developing Media Portal or tweak it for their own needs!"

    It uses a regular PC architecture and has more possibilities like MPG recording and TV (PVR). It also costs a lot more, but it is still less expensive than M$ Media Center 2005 and is not crippled with Digital Rights Management (DRM [Wikipedia]) technology.

    Project homepage of mediaportal


  • Introduction to m2eclipse

    The Eclipse IDE is the most widely used IDE for Java development today. At the same time, Apache Maven continues to mature, and has grown to be the industry standard for creating extensible and reliable enterprise builds. While it is certainly possible to develop applications in Eclipse and use Maven as a command-line build tool, most developers expect the IDE to know how to invoke and interact with the build tool they are using.

    Enter m2eclipse. The m2eclipse project provides support for Maven within the Eclipse IDE. It is a plugin which helps bridge the gap between Maven and Eclipse. Using m2eclipse you can develop a large multi-module project with nested Maven modules and have this hierarchical structure reflected in your Eclipse IDE. Using m2eclipse, you can launch and manage your project's Maven build using editors, and your IDE will become aware of both the local and remote Maven repositories, allowing you to quickly search for and locate any artifact made available in the Maven repository. m2eclipse will also change the way you create projects with a novel and easy-to-use interface for creating projects from Maven Archetypes.

    In this article, we will explore the features m2eclipse provides and help you start using an Eclipse plugin which provides real Maven integration for the best IDE platform available. After reading this article you should have enough information to install the m2eclipse plugin and start creating or importing existing Maven projects into your Eclipse workspace. You will also have an idea of some of the features provided by the plugin. Read more at ServerSide.

  • apache_maven

    Static analysis is used in the verification of properties of software for safety-critical computer systems and in locating potentially vulnerable/buggy code. It is desirable to make your build fail at the compile/test phases to detect faults earlier. Thanks to JSFUnit and Maven, you’ll be able to plug a JSF checker into your build with no effort.

    JSFUnit is a test framework for JSF applications. It is designed to allow complete integration testing and unit testing of JSF applications using a simplified API. JSFUnit tests run inside the container, which provides the developer full access to managed beans, the FacesContext, EL Expressions, and the internal JSF component tree. At the same time, you also have access to parsed HTML output of each client request.

    JSFUnit provides a set of unit tests for static analysis of JSF applications. Unlike regular JSFUnit tests, you can run these tests without any container, in the Maven “test” phase, like any regular unit test.

    Views tests (JSFUnitStaticAnalysisViewTest.java)

    • Do any of your facelets templates or well formed JSPs reference nonexistent managed beans?
    • Do any of your templates or JSPs have EL expressions for nonexistent managed bean actions or action listeners?

    Faces-configurations tests (JSFUnitStaticAnalysisFacesConfigTest.java)

    • Are all of your session and application scoped beans Serializable?
    • Invalid Managed Bean Scope?
    • Missing Managed Bean Class?
    • Faces Configurations Class Inheritance?
    • Missing Setter Methods?
    • Duplicate Managed Bean Names?

    TLD tests (JSFUnitStaticAnalysisTldTest.java)

    • Correct Tag Attribute Types?
    • Unique Tag Names?
    • Correct Tag Inheritance?
    • Unique Tag Attributes?

    Install

    Put the following in your web project pom.xml (the pom.xml with <packaging>war</packaging>) between <dependencies> .. </dependencies>. Note that these dependencies are only needed in scope “test”.

    <dependencies>
        <dependency>
            <groupId>org.jboss.jsfunit</groupId>
            <artifactId>jboss-jsfunit-analysis</artifactId>
            <version>1.0.0.GA</version>
            <scope>test</scope>
        </dependency>
        <!-- TLD test  dependencies  below, for
             View and facesConfig not needed-->
        <dependency>
            <groupId>javax.servlet.jsp</groupId>
            <artifactId>jsp-api</artifactId>
            <version>2.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>commons-logging</groupId>
            <artifactId>commons-logging</artifactId>
            <version>1.1.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>maven-taglib</groupId>
            <artifactId>maven-taglib-plugin</artifactId>
            <version>1.4.2</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    Add the following repository in your pom, settings.xml or your Maven proxy repository (Artifactory for example).

    <repositories>
        <repository>
            <id>jboss</id>
            <url>http://repository.jboss.com/maven2/</url>
        </repository>
    </repositories>
    and
    <pluginRepositories>
        <pluginRepository>
            <id>taglib</id>
            <url>http://maven-taglib.sourceforge.net/maven</url>
        </pluginRepository>
    </pluginRepositories>

    Now create the 3 test classes below in /src/test/java:

    JSFUnitStaticAnalysisViewTest.java

    package com.waltercedric.jsfunit;
    
    import java.util.HashSet;
    import java.util.Set;
    import org.jboss.jsfunit.analysis.AbstractViewTestCase;
    
    public class JSFUnitStaticAnalysisViewTest extends AbstractViewTestCase {
      private static Set absoluteViewPaths = new HashSet<String>() {
        {
          // add("C:/work/project/src/home.xhtml");
        }
      };
      private static Set recursiveViewPaths = new HashSet<String>() {
        {
          add("src/main/resources/pages");
          add("src/main/resources/bottom");
          add("src/main/resources/top");
          add("src/main/resources/menu");
        }
      };
      public JSFUnitStaticAnalysisViewTest() {
        super(absoluteViewPaths, recursiveViewPaths,
        "src/main/resources/META-INF/faces-config.xml");
      }
    }

    JSFUnitStaticAnalysisFacesConfigTest.java

    package com.waltercedric.jsfunit;
    
    import java.util.HashSet;
    import java.util.Set;
    import org.jboss.jsfunit.analysis.AbstractFacesConfigTestCase;
    
    public class JSFUnitStaticAnalysisFacesConfigTest extends AbstractFacesConfigTestCase {
      private static Set<String> paths = new HashSet<String>() {
        {
          add("src/main/resources/META-INF/faces-config.xml");
        }
      };
      public JSFUnitStaticAnalysisFacesConfigTest() {
        super(paths);
      }
    }

    JSFUnitStaticAnalysisTldTest.java

    package com.waltercedric.jsfunit;
    
    import java.util.HashSet;
    import java.util.Set;
    import org.jboss.jsfunit.analysis.AbstractTldTestCase;
    
    public class JSFUnitStaticAnalysisTldTest extends AbstractTldTestCase {
      private static Set<String> paths = new HashSet<String>() {
        {
          add("src/main/resources/META-INF/facelets.core.taglib.xml");
        }
      };
      public JSFUnitStaticAnalysisTldTest() {
        super(paths);
      }
    }
     

    References

  • apache_maven

    It is not unusual in a project to have a huge number of third party artifacts and plug-ins. Apache Maven helps you keep track of them, along with their transitive dependencies.

    But how do you know when a new version of an artifact is available? This is where the Maven Versions plug-in comes in handy.

    The Versions Plug-in is used when you want to manage the versions of artifacts in a project's POM.

    By running

    mvn versions:display-dependency-updates

    in any Apache Maven project or module, you’ll get for example (we have 25 Maven modules; only one is presented here as an example, the full list being too long):

    [INFO] --------------------------------------------------------------------------------------------------
    [INFO] Building Unnamed - com.innoveo:skye-services-api:jar:2.2.0-M-06
    [INFO] --------------------------------------------------------------------------------------------------
    [INFO]
    [INFO] The following dependencies in Dependency Management have newer versions:
    [INFO]   junit:junit............................................. 4.4 -> 4.8.1
    [INFO]   log4j:log4j......................................... 1.2.15 -> 1.2.16
    [INFO]   org.springframework:spring...................... 2.5.6 -> 2.5.6.SEC02
    [INFO]   org.springframework:spring-test............... 2.5.6 -> 3.0.4.RELEASE

    Attention:

    It is not always an easy task to update core components or 3rd party libraries in a complex piece of software, as it may introduce regressions or incompatibilities...

    At least, thanks to this Versions plug-in, you are aware that there may be something newer to try. What this plug-in does not report is why you may want to update some artifact libraries:

    • Do I have to use the latest version x.y.z because of security issues?
    • Will I get more performance by updating to x.y.z?
    • New version x.y.z resolves bug xxxx, but will I get other annoying issues?

    In all the above cases, you are on your own, but this is not the scope of this plug-in. You’ll anyway have to

    1. Carefully decide which library can be updated,
    2. Match it to your software roadmap,
    3. Have enough confidence in your test suite (unit test, BDD, integration tests) and testing team,
    4. Communicate with your customer (for security issues in 3rd party library)
    5. .. and the list goes on

    The Versions Plug-in has a lot of interesting goals.

    Some are also updating values across all pom.xml for you.

    • versions:update-parent updates the parent section of a project so that it references the newest available version. For example, if you use a corporate root POM, this goal can be helpful if you need to ensure you are using the latest version of the corporate root POM.
    • versions:update-properties updates properties defined in a project so that they correspond to the latest available version of specific dependencies. This can be useful if a suite of dependencies must all be locked to one version (see the small example after this list).
    • versions:update-child-modules updates the parent section of the child modules of a project so the version matches the version of the current project. For example, if you have an aggregator pom that is also the parent for the projects that it aggregates and the children and parent versions get out of sync, this mojo can help fix the versions of the child modules. (Note you may need to invoke Maven with the -N option in order to run this goal if your project is broken so badly that it cannot build because of the version mis-match).
    • versions:lock-snapshots searches the pom for all -SNAPSHOT versions and replaces them with the current timestamp version of that -SNAPSHOT, e.g. -20090327.172306-4
    • versions:unlock-snapshots searches the pom for all timestamp locked snapshot versions and replaces them with -SNAPSHOT.
    • versions:resolve-ranges finds dependencies using version ranges and resolves the range to the specific version being used.
    • versions:set can be used to set the project version from the command line.
    • versions:use-releases searches the pom for all -SNAPSHOT versions which have been released and replaces them with the corresponding release version.
    • versions:use-next-releases searches the pom for all non-SNAPSHOT versions which have been a newer release and replaces them with the next release version.
    • versions:use-latest-releases searches the pom for all non-SNAPSHOT versions which have been a newer release and replaces them with the latest release version.
    • versions:use-next-snapshots searches the pom for all non-SNAPSHOT versions which have been a newer -SNAPSHOT version and replaces them with the next -SNAPSHOT version.
    • versions:use-latest-snapshots searches the pom for all non-SNAPSHOT versions which have been a newer -SNAPSHOT version and replaces them with the latest -SNAPSHOT version.
    • versions:use-next-versions searches the pom for all versions which have been a newer version and replaces them with the next version.
    • versions:use-latest-versions searches the pom for all versions which have been a newer version and replaces them with the latest version.
    • versions:commit removes the pom.xml.versionsBackup files. Forms one half of the built-in "Poor Man's SCM".
    • versions:revert restores the pom.xml files from the pom.xml.versionsBackup files. Forms one half of the built-in "Poor Man's SCM".
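
    As a small illustration of versions:update-properties, if your dependency versions are factored out into properties (the property name below is just an example), the goal will bump the property value to the newest available release:

    <properties>
        <junit.version>4.4</junit.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit.version}</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    After running mvn versions:update-properties, junit.version would point to 4.8.1, matching the report above.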

    The easiest way to live dangerously is to try to update all 3rd parties in one shot by issuing

    mvn versions:use-latest-versions

    but that’s another story :-)

  • apache_maven

    The Maven Dependency Plugin, among other things, includes a dependency:analyze-duplicate goal.

    The dependency plugin provides the capability to manipulate artifacts. It can copy and/or unpack artifacts from local or remote repositories to a specified location.

    This Apache Maven plugin is really feature rich and provides a lot of interesting goals, among them dependency:tree, dependency:analyze, dependency:copy-dependencies and dependency:analyze-duplicate.
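
    As a minimal sketch, assuming you want duplicate dependency declarations reported on every build, the analyze-duplicate goal can be bound to a phase in your pom (I omit the plugin version here; pick a recent one):

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-dependency-plugin</artifactId>
                <executions>
                    <execution>
                        <id>check-duplicate-dependencies</id>
                        <phase>verify</phase>
                        <goals>
                            <goal>analyze-duplicate</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>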

  • apache_maven

    I compiled here a list of the major Apache Maven repositories (read an intro to repositories) for you. You can contact me or post a comment if you would like to add a missing one to the list.

    And of course mine at http://maven.waltercedric.com

  • apache_maven

    These things have disturbed us (the developer team at Innoveo.com) a lot in the past months. We solved them recently, and I would like to publish them here now to help more people.

    Someone creates a new Maven module; after updating from SVN, the module is not visible as a separate project.

    Scenario:

    1. User A checks out a maven project from SVN using "Checkout as Maven Project". All modules are now listed as separate projects in Eclipse
    2. User B creates a new module in the project, and checks it into SVN
    3. User A updates project
    4. New module does not show up as a separate project.

    Solution:

    1. A workaround was found: select the parent project
    2. Do Import -> Maven.
    3. Select the same name template; most modules should be grayed out because of name conflicts, but you should see the missing module in the list
    4. Tick the (new) module and import it.

    In the SVN perspective, when I choose "Checkout as Maven Project" on Maven projectA, I get an exception saying that maven.123457896 cannot be renamed

    Scenario:

    This error occurs sometimes, especially if you ever killed Eclipse during a previous Maven checkout (as it sometimes seems to hang forever). In fact the error message can be misleading, as M2Eclipse cannot rename maven.1234567896 to projectA because it may partially exist on disk.

    Solution:

    1. Stop Eclipse
    2. Go to the workspace location {workspace_loc}
    3. Delete the directory maven.1234567896 or any directory starting with maven.xxxxxxxx
    4. Also delete the temporarily created Maven project directory {workspace_loc}\projectA you were trying to check out, if it exists.
    5. Restart Eclipse, and in the SVN perspective, on Maven project A, retry and select "Checkout as Maven Project"

    .classpath or .project are not committed in SVN, how to add them?

    Scenario:

    You may have added an svn:ignore on some directories, or someone may have committed a recursive svn:ignore property on some module in the hierarchy. While we should never commit any .classpath to SVN, there are some rare cases where it is still needed, for example if you ever add special runtime server libraries that do not come from Maven dependencies.

    Solution:

    Even if there is an svn:ignore on a Maven module, or if a module has applied svn:ignore properties to all its children, you can always put a file under version control by doing the following:

    1. Go in SVN perspective
    2. Drill down to the Maven module location or directory in which you would like to add a file
    3. Right click New... then choose File,
    4. A pop up will open letting you choose a file on disk
    5. Don’t forget to Enter a commit comment

     

    Maven Surefire runs our test cases multiple times when using the site goal

    This is neither a bug nor an issue of Maven; it even looks like a feature!

    Some reporting plugins modify (instrument) the Java byte code of test cases, for example Cobertura (goal: cobertura:cobertura).

    The Cobertura tool is a free and easy to use source code test coverage analyzer. It helps you discover where your source code lacks test coverage.

    In some rare scenarios (multi-threaded test cases for example), it may be worth running the code twice, as instrumentation may modify the behavior and outcome of tests. So to sum up, Maven Surefire runs them once, then Cobertura runs them one more time, instrumented. One solution among others to escape this is to use Maven profiles, and to rely on another build, one that does not use reporting, to run the tests without instrumentation, as sketched below.
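
    Here is a minimal sketch of that idea, assuming the Cobertura plugin and a profile name of my own choosing: the default build runs plain Surefire, and only builds started with -Preporting pay the instrumentation price.

    <profiles>
        <profile>
            <id>reporting</id>
            <reporting>
                <plugins>
                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>cobertura-maven-plugin</artifactId>
                    </plugin>
                </plugins>
            </reporting>
        </profile>
    </profiles>

    Run the reporting build with mvn clean site -Preporting, and keep your regular continuous build on mvn clean install so the test cases are executed only once there.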

    More to come ..

  •  apache_maven

    'Integration testing' (sometimes called Integration and Testing, abbreviated I&T) is the activity of software testing in which individual software modules are combined and tested as a group. It occurs after unit testing and before system testing. Integration testing takes as its input modules that have been unit tested, groups them in larger aggregates, applies tests defined in an integration test plan to those aggregates, and delivers as its output the integrated system ready for system testing. [WikiPedia]

     

    I will put Selenium to that job. Selenium will allow me to run a set of unit tests against a running instance of my application and get feedback on the quality before delivering the software to a testing team.

    Making any Maven module Selenium-enabled is really easy; all you have to do is add the following to the dependencies section:

    <dependency>
        <groupId>org.openqa.selenium.client-drivers</groupId>
        <artifactId>selenium-java-client-driver</artifactId>
        <version>0.9.2</version>
        <scope>test</scope>
    </dependency>

    Now you should be able to cut and paste any test cases developed with Selenium IDE into /src/test/java/

    Selenium IDE is a Firefox add-on that records clicks, typing, and other actions to make a test, which you can play back in the browser or export to many different languages: Ruby, Python, Java, PHP, Perl, .Net, JavaScript to name a few. [Learn more]

    The Java code is in no way different from regular JUnit test cases, except that it does not use the latest JUnit 4.x annotations. You’ll be able to run tests like before (right click, Run As JUnit).

    package com.waltercedric.maven;
    
    import com.thoughtworks.selenium.SeleneseTestCase; 
    
    public class TestHello extends SeleneseTestCase {
    
      public void setUp() throws Exception {
        setUp("http://localhost/helloworld", "*iexplore");
      }
    
      public void testNew() throws Exception {
        selenium.open("/helloworld/index.xhtml");
        selenium.waitForPageToLoad("30000");
        verifyTrue(selenium.isTextPresent("Are you an existing Customer"));
      }
    }

    Some explanations are needed:

    • http://localhost/helloworld is the URL of my tomcat container where my web applications will be deployed (port 80)
    • I chose Internet Explorer as the browser, as it is nearly always available on any Windows PC; Firefox is not far away, just use “*firefox”, and firefox.exe has to be in the environment PATH.
    • The code above assumes that a Selenium RC server is running at localhost on port 4444; I will show you how to start one later in this post.

    Some remarks about the code above

    You will have to somehow make your own Selenium framework out of the generated code, for obvious reasons,

    • You’ll soon have to support many browsers ("*iexplore", "*firefox", "*opera") and as such use environment variables or configuration files. In that case I recommend you use Selenium Grid instead of Selenium RC.
    • You cannot leave the URL and port of the container hard coded (http://localhost/helloworld); this URL may change if you target a different runtime.
    • You may want to reuse some parts of the generated code multiple times in different unit tests (like login/logout stuff); Java inheritance, interfaces and patterns may arrive sooner or later, even if this is unit test code.

    Selenium test cases or integration tests are meant to be run in the “integration-test” phase against a running instance of your application. That is why you should not forget to deploy your application with Maven Cargo, or run it inside Jetty, in the Maven “pre-integration-test” phase.
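
    Here is a minimal sketch of the Jetty variant, assuming the maven-jetty-plugin; parameter names can differ between plugin versions, so double check against the version you use. Jetty is started as a daemon before the integration tests and stopped afterwards:

    <plugin>
        <groupId>org.mortbay.jetty</groupId>
        <artifactId>maven-jetty-plugin</artifactId>
        <configuration>
            <!-- do not block the build, and let the stop goal reach the running server -->
            <daemon>true</daemon>
            <stopKey>selenium-stop</stopKey>
            <stopPort>9999</stopPort>
        </configuration>
        <executions>
            <execution>
                <id>start-jetty</id>
                <phase>pre-integration-test</phase>
                <goals>
                    <goal>run</goal>
                </goals>
            </execution>
            <execution>
                <id>stop-jetty</id>
                <phase>post-integration-test</phase>
                <goals>
                    <goal>stop</goal>
                </goals>
            </execution>
        </executions>
    </plugin>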

    How to use Surefire to run JUnit tests in the “test” phase and integration tests in the “integration-test” phase

    The answer is to carefully configure Surefire and name your Java packages accordingly. The pom.xml below shows this trick.

    In the test phase, test cases under a package containing the word integration or selenium are omitted, while in the “integration-test” phase they are run.

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.4.2</version>
        <configuration>
            <argLine> -Xmx512m -DuseSystemClassLoader=true</argLine>
            <skip>false</skip>
            <testFailureIgnore>true</testFailureIgnore>
            <excludes>
                <exclude>**/integration/*</exclude>
                <exclude>**/selenium/*</exclude>
            </excludes>
        </configuration>
        <executions>
            <execution>
                <id>integration-tests</id>
                <phase>integration-test</phase>
                <goals>
                    <goal>test</goal>
                </goals>
                <configuration>
                    <skip>false</skip>
                    <excludes>
                        <exclude>none</exclude>
                    </excludes>
                    <includes>
                        <include>**/integration/*</include>
                        <include>**/selenium/*</include>
                    </includes>
                </configuration>
            </execution>
        </executions>
    </plugin>

    Now it is time to start a selenium server locally or remotely so we can start our newly defined test cases.

    Selenium Remote control

    Selenium Remote Control (RC) is a test tool that allows you to write automated web application UI tests in any programming language against any HTTP website using any mainstream JavaScript-enabled browser.

     

    You can start a Selenium RC server either:

    • Outside Eclipse, like any Java process,
    • Inside Eclipse with a Java launcher,
    • Inside Eclipse with Maven and a Java launcher,
    • Inside any Maven phase thanks to a plugin XXXXXXXXXXXXX (see the sketch below).

    I recommend you install Selenium RC in a dedicated VM (VMware, virtual desktop, Xen) and make it team- or enterprise-wide. I would always recommend putting Linux to work for such a task; unfortunately Internet Explorer does not run at all on Mac or Linux. This way you can run a shared Selenium server in your infrastructure that can later be accessed by many continuous build agents.
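
    For the "Maven phase" option, one plugin I know of is the Codehaus selenium-maven-plugin; below is a minimal sketch (version omitted, parameter names may vary between plugin versions) starting the server in the background before the integration tests:

    <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>selenium-maven-plugin</artifactId>
        <executions>
            <execution>
                <id>start-selenium</id>
                <phase>pre-integration-test</phase>
                <goals>
                    <goal>start-server</goal>
                </goals>
                <configuration>
                    <!-- run in the background so the build can continue -->
                    <background>true</background>
                </configuration>
            </execution>
        </executions>
    </plugin>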

    Selenium RC is a Java process requiring only 2 jars to start properly. Download it and unpack to c:\selenium-server-1.0-beta-2

     

     

     

    Create the following in a batch file start.bat; normally all you have to do is change the first 3 lines:

    set JAVA_HOME=c:\jdk1.6
    set FIREFOX_HOME=C:\tools\Firefox3
    set SELENIUM_RC_HOME=c:/selenium-server-1.0-beta-2
    
    set PATH=%PATH%;%FIREFOX_HOME%
    set CLASSPATH=%CLASSPATH%;%SELENIUM_RC_HOME%/selenium-server.jar;%SELENIUM_RC_HOME%/selenium-server-coreless.jar 
    %JAVA_HOME%/bin/java -jar %SELENIUM_RC_HOME%/selenium-server.jar

    If everything runs properly, you should see an ugly DOS window like the one below

     starting.selenium.rc.outside.eclipse

    Pointing the browser to http://localhost:4444 will return an error 403, which is not a sign of malfunction; currently Selenium RC has no web GUI.

    seleniumRCtestInBrowser

     

    Start Selenium RC server inside Eclipse

    If you decide to run Selenium RC inside Eclipse, you’ll mainly benefit from:

    • Project sharing and versioning in CVS/SVN,
    • Command line parameters that will start Selenium RC can also be shared,

    Selenium Server in a Maven project/module

    Create a new Maven Project named “SeleniumServer” and copy into its pom.xml the following

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.waltercedric.maven</groupId>
        <artifactId>SeleniumServer</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <dependencies>
            <dependency>
                <groupId>org.seleniumhq.selenium.server</groupId>
                <artifactId>selenium-server-coreless</artifactId>
                <version>1.0-beta-2</version>
            </dependency>
            <dependency>
                <groupId>org.seleniumhq.selenium.core</groupId>
                <artifactId>selenium-core</artifactId>
                <version>1.0-beta-2</version>
            </dependency>
        </dependencies>
    </project>

    Create a Java launcher and use org.openqa.selenium.server.SeleniumServer as the main class. If you start the launcher, you will see the following in the Eclipse console:

    seleniumRC.started.in.eclipse

     

    Selenium RC is now waiting on port 4444 for Selenium Test case orders.

    You can now run your test case in Eclipse like any other test case, using the Eclipse built-in JUnit runner.

  • apache_maven

    How to add a dependencies graph to multi-module projects. With this Maven plugin, you’ll be able to visualize Maven module interdependencies and dependencies in any scope (compile, test, provided, system, runtime).

    depgraph:depgraph can be used to draw a dependency graph from the project the mojo is executed in. It traverses all dependencies and creates a graph using Graphviz. It draws a dependency graph just for your project. For a simple POM with no sub-modules, it draws a graph of all dependencies (including transitive ones) below it. For a POM with sub-modules, it goes into each leaf POM and generates a separate graph for it.

     

    Here is an example of output on the plugin itself

    depgraph

     

    Install Graphviz on all TeamCity agents

    Graphviz is an open source graph visualization software. It has several main graph layout programs. See the gallery for some sample layouts. It also has web and interactive graphical interfaces, and auxiliary tools, libraries, and language bindings.

    Chances are that you are using Linux, so installation is very easy and just a few clicks away. For OpenSuse:

    # zypper in graphviz

    or for Debian:

    # apt-get install graphviz

    On Windows, use the binary installer and put graphviz/bin in your PATH environment variable!

    Configure your POM

    Ideally put this in your parent pom inside the <build> </build> tag

    <plugin>
            <groupId>ch.elca.el4j.maven.plugins</groupId>
            <artifactId>maven-depgraph-plugin</artifactId>
            <version>1.7</version>
    </plugin>

    More configuration settings can be found HERE. Now add a new plugin repository location either in your pom.xml (see below) or, better, in your Artifactory proxy:

      <pluginRepository>
        <id>elca-services</id>
        <url>http://el4.elca-services.ch/el4j/maven2repository</url>
        <releases>
         <enabled>true</enabled>
        </releases>
      </pluginRepository>

     

    Configure Teamcity build

    Add in the Maven runner of every TeamCity Build

    addDependenciesGraphGoalsInBuild

     

    Maven goals

    • depgraph:depgraph can be used to draw a dependency graph from the project the mojo is executed in. It traverses all dependencies and creates a graph using Graphviz. It draws a dependency graph just for your project. For a simple POM with no submodules, it draws a graph of all dependencies (including transitive ones) below it. For a POM with submodules, it goes into each leaf POM and generates a separate graph for it.
    • depgraph:fullgraph can be used to draw a dependency graph from the project the mojo is executed in. It traverses all dependencies and creates a graph using Graphviz. It draws a graph for all the modules as they are interconnected. Same as depgraph for a simple POM, but for a POM with submodules it generates a combined dependency graph incorporating all modules.

    You may also want to let developers look at module dependency graphs in TeamCity, so you may want to add **/site/images/*.png => dependenciesGraph to the artifact paths.

    Artifacts are files produced by a build. After finishing a build, TeamCity searches for artifacts in the build's checkout directory according to the specified artifact patterns. Matching files are then uploaded to the server, where they become available for download. More ..

    artifactPath

     

    Configure Eclipse

    Install Graphviz and don’t forget to have it in PATH.

    You can share an Eclipse Maven launcher in your parent project: right click on your pom.xml, select run as Maven configuration, and specify either depgraph:fullgraph or depgraph:depgraph as goals.

  • apache_maven

    What can you do to avoid, when you use one Maven dependency, also inheriting some undesirable older dependency (that is to say, an older transitive dependency)?

    The fix is to add an exclusion to the dependency in question.
    For example, let's start with a dependency on version 1.2 of the commons-jxpath library:

    <dependency>
       <groupId>commons-jxpath</groupId>
       <artifactId>commons-jxpath</artifactId>
       <version>1.2</version>
       <scope>compile</scope> <!-- default scope, shown for the sake of example -->
    </dependency>

    This dependency on commons-jxpath 1.2 will bring in an old version of JUnit (3.8). In order to ensure that I am using a more recent
    version of JUnit (4.4),

    I need to put in exclusions for such transitive dependencies of commons-jxpath, which I do as follows:

    <dependency>
       <groupId>commons-jxpath</groupId>
       <artifactId>commons-jxpath</artifactId>
       <version>1.2</version>
       <scope>compile</scope>
       <exclusions>
          <exclusion>
             <groupId>junit</groupId>
             <artifactId>junit</artifactId>
          </exclusion>
          <!-- I can put many of these here -->
       </exclusions>
    </dependency>

    Having excluded them, they will no longer be in the build.

    Now, there are still things that can happen in the background:

    • Another 3rd-party artifact may still pull in JUnit as a transitive dependency, and then you will have to rely on (and trust) transitive
      dependency mediation
    • You can explicitly declare the versions that you want in every pom.xml or, better, in your parent pom.xml (see the sketch below)
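
    For the second option, here is a minimal sketch of pinning the version directly in the POM, using the JUnit 4.4 example from above. Because it is declared at depth one, it is the "nearest definition" and wins over the transitive 3.8:

    <!-- declared directly, so this is the nearest definition and wins dependency mediation -->
    <dependency>
       <groupId>junit</groupId>
       <artifactId>junit</artifactId>
       <version>4.4</version>
    </dependency>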

    Transitive dependency mediation

    Dependency mediation - this determines what version of a dependency will be used when multiple versions of an artifact are
    encountered. Currently, Maven 2.0 only supports using the "nearest definition" which means that it will use the version of
    the closest dependency to your project in the tree of dependencies. You can always guarantee a version by declaring it
    explicitly in your project's POM. Note that if two dependency versions are at the same depth in the dependency tree, until
    Maven 2.0.4 it was not defined which one would win, but since Maven 2.0.5 it's the order in the declaration that counts: the
    first declaration wins.
    "nearest definition" means that the version used will be the closest one to your project in the tree of dependencies, eg. if
    dependencies for A, B, and C are defined as A -> B -> C -> D 2.0 and A -> E -> D 1.0, then D 1.0 will be used when building A
    because the path from A to D through E is shorter. You could explicitly add a dependency to D 2.0 in A to force the use of D 2.0
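
    For illustration only (the groupId below is a placeholder for the hypothetical artifact D), forcing D 2.0 in A's pom.xml would look like this:

    <!-- declared directly in A, so D 2.0 becomes the nearest definition -->
    <dependency>
       <groupId>com.example</groupId>
       <artifactId>D</artifactId>
       <version>2.0</version>
    </dependency>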

    How do you find out what the transitive dependencies are?

    You can't control what you do not know!

    One tool that can be used during the build stage, or explicitly on the command line, is the Maven plugin maven-dependency-plugin:

       <build>
          <plugins>
             <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-dependency-plugin</artifactId>
             </plugin>
          </plugins>
       </build>
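
    If you prefer to have the tree printed automatically instead of invoking it by hand, you could bind the goal to a lifecycle phase; a minimal sketch (the choice of the verify phase here is just an assumption):

    <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-dependency-plugin</artifactId>
       <executions>
          <execution>
             <id>print-dependency-tree</id>
             <phase>verify</phase>
             <goals>
                <goal>tree</goal>
             </goals>
          </execution>
       </executions>
    </plugin>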

    You can then also invoke the goal dependency:tree directly, so a typical build command could look like

    mvn clean install dependency:tree
    or
    mvn clean install dependency:list   (easier to tokenize in an Excel sheet)

    The output then looks like the following.

    With no exclusions

    [INFO] [dependency:tree]
    [INFO] com.test:test:jar:0.0.1-SNAPSHOT
    [INFO] \- commons-jxpath:commons-jxpath:jar:1.2:compile
    [INFO]    +- xerces:xerces:jar:1.2.3:compile
    [INFO]    +- javax.servlet:servlet-api:jar:2.2:compile
    [INFO]    +- junit:junit:jar:3.8:compile
    [INFO]    +- ant:ant-optional:jar:1.5.1:compile
    [INFO]    +- xml-apis:xml-apis:jar:1.0.b2:compile
    [INFO]    +- jdom:jdom:jar:b9:compile
    [INFO]    +- commons-beanutils:commons-beanutils:jar:1.4:compile
    [INFO]    +- commons-logging:commons-logging:jar:1.0:compile
    [INFO]    \- commons-collections:commons-collections:jar:2.0:compile
    [INFO] [dependency:list]
    [INFO]
    [INFO] The following files have been resolved:
    [INFO]    ant:ant-optional:jar:1.5.1:compile
    [INFO]    commons-beanutils:commons-beanutils:jar:1.4:compile
    [INFO]    commons-collections:commons-collections:jar:2.0:compile
    [INFO]    commons-jxpath:commons-jxpath:jar:1.2:compile
    [INFO]    commons-logging:commons-logging:jar:1.0:compile
    [INFO]    javax.servlet:servlet-api:jar:2.2:compile
    [INFO]    jdom:jdom:jar:b9:compile
    [INFO]    junit:junit:jar:3.8:compile
    [INFO]    xerces:xerces:jar:1.2.3:compile
    [INFO]    xml-apis:xml-apis:jar:1.0.b2:compile

     

    With exclusions

    [dependency:tree]
    [INFO] com.test:test:jar:0.0.1-SNAPSHOT
    [INFO] \- commons-jxpath:commons-jxpath:jar:1.2:compile
    [INFO]    +- xerces:xerces:jar:1.2.3:compile
    [INFO]    +- javax.servlet:servlet-api:jar:2.2:compile
    [INFO]    +- ant:ant-optional:jar:1.5.1:compile
    [INFO]    +- xml-apis:xml-apis:jar:1.0.b2:compile
    [INFO]    +- jdom:jdom:jar:b9:compile
    [INFO]    +- commons-beanutils:commons-beanutils:jar:1.4:compile
    [INFO]    +- commons-logging:commons-logging:jar:1.0:compile
    [INFO]    \- commons-collections:commons-collections:jar:2.0:compile
    [INFO] [dependency:list]
    [INFO]
    [INFO] The following files have been resolved:
    [INFO]    ant:ant-optional:jar:1.5.1:compile
    [INFO]    commons-beanutils:commons-beanutils:jar:1.4:compile
    [INFO]    commons-collections:commons-collections:jar:2.0:compile
    [INFO]    commons-jxpath:commons-jxpath:jar:1.2:compile
    [INFO]    commons-logging:commons-logging:jar:1.0:compile
    [INFO]    javax.servlet:servlet-api:jar:2.2:compile
    [INFO]    jdom:jdom:jar:b9:compile
    [INFO]    xerces:xerces:jar:1.2.3:compile
    [INFO]    xml-apis:xml-apis:jar:1.0.b2:compile

     
    See the Maven Dependency Plugin for more details.
  • apache_maven

    What you will learn in this Maven How To

    • How to generate JAX-WS proxy stubs against a local or remote WSDL
    • How to compile your Maven project or module against a specific version of Java (here 1.6) using the Maven Compiler Plugin
    • How to attach the source code of your project to the binary artifact using the Maven Source Plugin
    • How to deploy the binary and source artifacts with the Maven Deploy Plugin to a remote enterprise Artifactory Maven repository.

    Once everything is configured, running “mvn deploy” on the pom.xml will execute the following:

    • JAX-WS will create proxy stubs in src/main/java
    • Maven will compile all proxy stubs into /target/classes
    • Maven will create, by the “install” phase, two JAR artifacts, jaxws-0.0.1-SNAPSHOT-sources.jar and jaxws-0.0.1-SNAPSHOT.jar, in /target/
    • Maven will deploy these JARs to Artifactory so they can be used by all your developers.

    How to generate JAX-WS proxy stubs against a local or remote WSDL

    Add this to your pom.xml

        <dependencies>
            <dependency>
                <groupId>com.sun.xml.ws</groupId>
                <artifactId>jaxws-rt</artifactId>
                <version>2.1.3</version>
            </dependency>
        </dependencies>

    Add this to your settings.xml or pom.xml or, better, to your Artifactory list of repositories:

        <pluginRepositories>
            <pluginRepository>
                <id>java.net2</id>
                <url>http://download.java.net/maven/2/</url>
            </pluginRepository>
        </pluginRepositories>
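
    If your repository manager does not already proxy java.net, you may also need the same URL declared as a regular repository so that the jaxws-rt dependency itself can be resolved; a sketch:

    <repositories>
        <repository>
            <id>java.net2</id>
            <url>http://download.java.net/maven/2/</url>
        </repository>
    </repositories>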

    Then configure the Maven JAX-WS plugin; you can use either a remote WSDL or a local WSDL saved on disk:

    <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>jaxws-maven-plugin</artifactId>
        <executions>
            <execution>
                <goals>
                    <goal>wsimport</goal>
                </goals>
            </execution>
        </executions>
        <configuration>
            <verbose>true</verbose>
            <!--
                <bindingFiles>
                <bindingFile>${basedir}/src/main/resources/binding.xml</bindingFile>
                </bindingFiles>
            -->
            <sourceDestDir>${basedir}/src/main/java</sourceDestDir>
            <!--
                <wsdlDirectory>c:\</wsdlDirectory> <wsdlFiles>
                <wsdlFile>stockquote.wsdl</wsdlFile> </wsdlFiles>
            -->
            <wsdlUrls>
                <wsdlUrl>http://www.webservicex.net/stockquote.asmx?WSDL</wsdlUrl>
            </wsdlUrls>
        </configuration>
    </plugin>

    If you now run mvn jaxws:wsimport or mvn install, the Maven plugin will fetch the WSDL and create proxy stubs in src/main/java.

    How to compile your maven project or module against a specific version of Java

    Just put the following inside the <plugins> section of <build>. Use at least Java 1.5; I obviously chose 1.6 in the example below:

    <plugin>
        <inherited>true</inherited>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>2.0.2</version>
        <configuration>
            <meminitial>128m</meminitial>
            <source>1.6</source>
            <target>1.6</target>
            <!--
                <executable>${JAVA_HOME}/bin/javac</executable> <fork>true</fork>
                <verbose>true</verbose> <showDeprecation>true</showDeprecation>
                <showWarnings>true</showWarnings>
            -->
        </configuration>
    </plugin>
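
    As an alternative sketch (it relies on the compiler plugin reading the maven.compiler.* user properties), you can set the levels as POM properties instead of configuring the plugin explicitly:

    <properties>
        <!-- picked up by the maven-compiler-plugin source/target parameters -->
        <maven.compiler.source>1.6</maven.compiler.source>
        <maven.compiler.target>1.6</maven.compiler.target>
    </properties>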

    How to attach the source code of your project to the binary artifact using the Maven Source Plugin

    This small plugin will create a new JAR file in /target with the classifier “sources” for you. Note that with <finalName/> you can deviate from the standard naming scheme, which is of course not recommended.

    “A Maven 2 plugin that creates a project-version-sources.jar right along side the project-version.jar in the target directory. We are using the verify phase here because it is the phase that comes before the install phase, thus making sure that the sources jar has been created before the install takes place.”

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-source-plugin</artifactId>
        <version>2.1</version>
        <configuration>
            <!--<finalName></finalName>
            -->
        </configuration>
        <executions>
            <execution>
                <id>attach-sources</id>
                <phase>verify</phase>
                <goals>
                    <goal>jar-no-fork</goal>
                </goals>
            </execution>
        </executions>
    </plugin>

    How to deploy the binary and source artifacts using the Maven Deploy Plugin to a remote Artifactory

     

    Add to your pom.xml the following

    <distributionManagement>
        <repository>
            <id>internal.repo</id>
            <name>MyCo Internal Repository</name>
            <url>http://maven.waltercedric.com:8080/artifactory/libs-releases</url>
        </repository>
        <snapshotRepository>
            <id>internal.repo.snapshot</id>
            <name>MyCo Internal Repository</name>
            <url>http://maven.waltercedric.com:8080/artifactory/libs-snapshots</url>
        </snapshotRepository>
    </distributionManagement>

    And add your administrator credentials to your settings.xml (note: don’t try admin/password against my Artifactory repository, I use far more complex passwords). The Maven Deploy Plugin detects whether your artifact version contains the word SNAPSHOT; if it does, it uses the <snapshotRepository></snapshotRepository> and the internal.repo.snapshot credentials.

    <servers>
        <server>
            <id>internal.repo</id>
            <username>admin</username>
            <password>password</password>
        </server>
        <server>
            <id>internal.repo.snapshot</id>
            <username>admin</username>
            <password>password</password>
        </server>
    </servers>

    Now if you run “mvn deploy” on the pom.xml, the following will be executed:

    • JAX-WS will create proxy stubs in src/main/java
    • Maven will compile all proxy stubs into /target/classes
    • Maven will create two JAR artifacts, jaxws-0.0.1-SNAPSHOT-sources.jar and jaxws-0.0.1-SNAPSHOT.jar
    • Maven will deploy these JARs to Artifactory

    Download

    I created a new Maven download section for all future resource downloads. There you'll find ready-to-use Maven/Eclipse projects.


  • apache_maven

    Maven projects are created using the "New Maven Project" wizard from M2Eclipse; see here for more details:
    http://docs.codehaus.org/display/M2ECLIPSE/Creating+Maven+projects

    Maven modules are a different beast, as they are supposed to have a parent in their hierarchy; if you use Maven,
    you already understand what the differences are.

    You can get into trouble if you try to make a lot of flat Maven projects in Eclipse, although it may seem natural to do so. A lot
    of people have gone down that path, and it may work if you use the relativePath trick:

    • parent (contains the super pom)
    • common-api, reference the parent using the <relativePath>../parent/pom.xml</relativePath>
    • common-core reference the parent using the <relativePath>../parent/pom.xml</relativePath>
    • common-spring reference the parent using the <relativePath>../parent/pom.xml</relativePath>
    • services-api  ...you get the idea
    • services-core
    • services-spring

    Here is an example for common-spring:

    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.waltercedric</groupId>
        <artifactId>common-spring</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <parent>
            <groupId>com.waltercedric</groupId>
            <artifactId>parent</artifactId>
            <version>0.0.1-SNAPSHOT</version>
            <relativePath>../parent/pom.xml</relativePath>
        </parent>
    </project>
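
    To complete the picture of this (discouraged) flat setup, the parent POM would typically aggregate the sibling projects with ../ module paths; a sketch based on the list above:

    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.waltercedric</groupId>
        <artifactId>parent</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <packaging>pom</packaging>
        <modules>
            <module>../common-api</module>
            <module>../common-core</module>
            <module>../common-spring</module>
            <module>../services-api</module>
            <module>../services-core</module>
            <module>../services-spring</module>
        </modules>
    </project>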

    (screenshot: maven.all.projects) Do not make everything a Maven project!

    With such a mapping you'll get into trouble! Not all plugins seem to support this kind of trick (relativePath)
    in the Maven reactor (maven-deploy and maven-release, for example). It is a lot wiser to also represent
    your product or project in Eclipse in a tree manner. This breaks down the complexity and introduces
    more flexibility, so why not just use Maven modules?
    Maven 2 recommends representing modules in a tree manner:

    |--- parent                            (packaging : pom)
    |        |--- common                   (packaging : pom)
    |        |        |--- common-api      (packaging : any)
    |        |        |--- common-core     (packaging : any)
    |        |        |--- common-spring   (packaging : any)
    |        |--- services                 (packaging : pom)
    |        |        |--- services-api    (packaging : any)
    |        |        |--- services-core   (packaging : any)
    |        |        |--- services-spring (packaging : any)
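
    With the tree layout, each level is a plain aggregator; a minimal sketch of the top-level parent POM (common and services are themselves POM-packaged aggregators listing their own submodules):

    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.waltercedric</groupId>
        <artifactId>parent</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <packaging>pom</packaging>
        <modules>
            <module>common</module>
            <module>services</module>
        </modules>
    </project>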

    To achieve that goal, M2Eclipse uses a trick: it puts all modules (common-xxx and services-xxx) in the
    same Maven project (parent) while displaying a workspace that looks "project flat"...

    First, create your product or project as a Maven project

    (screenshot: one.maven.project)

    Focus on that project, then right-click and choose New... > Other... > Maven Module.

    (screenshots: maven.module.helper, new.maven.module)

    Click Next. On the next page, choose POM packaging, as this module will contain module
    definitions (an aggregation of the submodules common-api, common-core, ...).

    (screenshot: finish.module.creation)

    The result is what Maven expects: a parent directory with a set of modules and POMs.

    (screenshot: result.maven.modules)

    The workspace fakes the view and shows Maven modules as what I call "ghost projects".

    Any change in these "ghost projects" in fact opens the Maven project (remember, that is where
    the code really lives), so any change in common-api forces you to sync and version the Maven project.

    On the file system it is even more visible: there is only one Maven project (parent) and all its modules
    are nested inside it!

    (screenshot: maven.interlinked.modules)

    You can create an unlimited number of modules nested in the Maven project...
    Some open source frameworks prefer to name their first module level component-xxx, and all other modules module-xxx:

    |--- myProduct                                  (packaging : pom)
    |        |--- component-common                  (packaging : pom)
    |        |        |--- module-common-api        (packaging : any)
    |        |        |--- module-common-core       (packaging : any)
    |        |        |--- module-common-spring     (packaging : any)
    |        |--- component-services                (packaging : pom)
    |        |        |--- module-services-api      (packaging : any)
    |        |        |--- module-services-core     (packaging : any)
    |        |        |--- module-services-spring   (packaging : any)