framework

Framework may refer to many things; read more at Wikipedia

  • IBM has published an interesting online article about DWR (Direct Web Remoting) technology...

    In the simplest terms, DWR is an engine that exposes methods of server-side Java objects to JavaScript code. Effectively, with DWR, you can eliminate all of the machinery of the Ajax request-response cycle from your application code. This means your client-side code never has to deal with an XMLHttpRequest object directly, or with the server's response. You don't need to write object serialization code or use third-party tools to turn your objects into XML. You don't even need to write servlet code to mediate Ajax requests into calls on your Java domain objects.
    Read more at IBM developerworks
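
    As a rough illustration (not taken from the IBM article), this is the kind of plain server-side Java class DWR can expose to JavaScript; the class and method names here are hypothetical:

    A hypothetical DWR-exposed class
    // Once declared in DWR's configuration (WEB-INF/dwr.xml), DWR generates a JavaScript
    // proxy for this class, so the browser can call greet() without any hand-written
    // XMLHttpRequest handling or serialization code.
    public class GreetingService {

        public String greet(String name) {
            return "Hello, " + name + "!";
        }
    }
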
  • The openSUSE team is proud to announce the release of openSUSE 10.3. Promoting the use of Linux everywhere, the openSUSE project provides free, easy access to the world’s most usable Linux distribution, openSUSE. openSUSE is released regularly, is stable, secure, contains the latest free and open source software, and comes with several new technologies.

    openSUSE 10.3 will be supported with security and other serious updates for a period of 2 years.

    This version contains new beautiful green artwork, KDE 3.5.7 and parts of KDE 4, SUSE-polished GNOME 2.20, a GTK version of YaST, a new 1-click-install technology, MP3 support out-of-the-box, new and redesigned YaST modules, compiz and compiz fusion advances, virtualisation improvements, OpenOffice.org 2.3, Xfce 4.4.1, and much more! Read on for details of what is new and available in openSUSE 10.3, and for all the necessary download links

    Screenshots are available here 

    Download Download Download Download Download!

    • 1 DVD containing OSS and NonOSS software (torrents for: i386, x86_64, ppc). Languages supported: English, Portuguese, French, Italian, Spanish, German, Chinese (Simpl. & Trad.), Japanese, Russian, Czech, Hungarian, Polish, Finnish, Danish, Swedish, Dutch
    • 1 CD with a default KDE installation (i386, x86_64, not for ppc, English only)
    • 1 CD with a default GNOME installation (i386, x86_64, not for ppc, English only)
    • 1 AddOn CD with only NonOSS packages (i386 or x86_64, ppc)
    • 1 AddOn CD with language packages that are used for extra languages (i386, x86_64, ppc, only to be used with DVDs!)

     

  • In computer programming, a unit test is a method of testing the correctness of a particular module of source code. The idea is to write test cases for every non-trivial function or method in the module so that each test case is separate from the others if possible.

    JUNIT: A testcase framework for Java

     


    Advantages
    The goal of unit testing is to isolate each part of the program and show that the individual parts are correct. It provides a written contract that the piece must satisfy. This isolated testing provides two main benefits:

    • Encourages change
      Unit testing allows the programmer to refactor code at a later date, and make sure the module still works correctly (regression testing). This provides the benefit of encouraging programmers to make changes to the code since it is easy for the programmer to check if the piece is still working properly.
    • Simplifies Integration
      Unit testing helps eliminate uncertainty in the pieces themselves and can be used in a bottom-up testing approach. Testing the parts of a program first and then testing the sum of its parts makes integration testing easier.
    • Documentation
      As an added value, all your test cases can act as documentation for your set of classes.

    Kent Beck (CSLife) and Erich Gamma (OTI Zürich) wrote a very good article:
    "Testing is not closely integrated with development. This prevents you from measuring the progress of development- you can't tell when something starts working or when something stops working. Using JUnit you can cheaply and incrementally build a test suite that will help you measure your progress, spot unintended side effects, and focus your development efforts." more here

    Limitations
    It is important to realize that unit-testing will not catch every error in the program. By definition, it only tests the functionality of the units themselves. Therefore, it will not catch integration errors, performance problems and any other system-wide issues. Unit testing is only effective if it is used in conjunction with other software testing activities.

    Usage
    There are many ways to use JUnit:

    • Write your set of classes, then some test cases that should run and validate the work done,
    • Write test cases first that won't run because the classes don't exist yet, then write the code that makes them run!
    • Correct a bug in a piece of code, and write a test case to make sure it won't reappear one day.

    JUnit is based on the fact that you want to test code. Normally you know the expected result; all you have to do is exercise your code (class, method, set of cooperating classes) and test whether the response is correct.
    Let's take an example... I have a class that can replace patterns in a string (like in JDK 1.4.2: "aText".replace("searchPattern","withThisPattern")). Since I wrote the class and know its purpose, I can write some pertinent test cases. I want to protect this object, and all other objects that may use it, from loss of functionality and bugs which may lead to malfunctions in a complex system.

    Writing good Testcases

    There is no rule for how to write a test, but remember:

    • A test case should be pertinent; otherwise it will have no quality impact and will lead to a loss of developer time.
    • Be honest: push your objects to the limit of their usage! Try to describe and test all the functionality of your set of objects.
    • You need to do some dummy/obvious assertions (but sometimes these dummy tests are not obvious with complex objects and/or runtime environments).
      A constructor should not give back the same instance
      (except if you are using a singleton pattern):
      ClassA classA = new ClassA();
      ClassA classA1 = new ClassA();
      assertNotSame(classA, classA1);
      

    The JUNIT language

    JUnit uses a few primitive methods to achieve regression testing. As of JUnit 3.8.1, the assertion methods are all located in junit.framework.Assert. A lot of third-party tools have been developed to extend the testing possibilities, for example to databases, EJBs and JSPs.

    • The assert methods test equality of nearly all standard Java types.
    • If these methods are not enough, you can always decide to validate your objects on your own and call fail() if you decide that the conditions are not met, as in the sketch below.
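
    As an illustration (Account is a hypothetical class, not part of JUnit), a custom validation that falls back to fail() when the built-in assertions are not enough:

    Custom validation with fail()
    public class AccountTest extends junit.framework.TestCase {

        public void testBalanceNeverNegative() {
            Account account = new Account();   // hypothetical class under test
            account.deposit(100);
            account.withdraw(50);

            // standard assertion on a primitive value
            assertEquals(50, account.getBalance());

            // custom validation: decide yourself and call fail() if the condition is not met
            if (account.getBalance() < 0) {
                fail("Balance must never be negative, but was " + account.getBalance());
            }
        }
    }
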

    Write your first Testcase

    A JUnit test is a class which extends junit.framework.TestCase and has some methods beginning with the word "test".

    A trivial example:

    Your first JUnit test case class
    public class SquareTest extends junit.framework.TestCase {

            public void testClassA() {

             Square squareA = new Square();
             Square squareB = new Square();

             assertNotSame(squareB, squareA);
             assertEquals("ClassA a dummy example", squareA.getName());

             // verify setter and getter
             squareA.setX(2); assertEquals(2, squareA.getX());
             squareA.setY(4); assertEquals(4, squareA.getY());

             // perimeter of a square is 2x + 2y
             assertEquals(12, squareA.getPerimeter());
             // surface
             assertEquals(8, squareA.getSurface());
            }

            public void testCloneability() {

             Square squareA = new Square();
             squareA.setX(10);

             Square squareB = (Square) squareA.clone();

             // if Square does not override equals(), the clone is not equal to the original
             assertFalse(squareA.equals(squareB));

             // test deep clone
             assertEquals(10, squareB.getX());
            }
    }
    

    Writing a test case is always more or less the same:

    1. Create one or more classes extending junit.framework.TestCase and implement some test methods.
    2. In these methods, create instances of the objects you want to test or validate.
    3. Use your objects: use setters, getters and constructors to change their internal state (here is the concept of pushing your objects to the limits: use the full range of input data accepted by your objects).
    4. Test the values returned by methods, assuming that you know what the correct result should be.
    5. Write a lot of them to test the maximum of the functionality provided by your objects.

    Run your testcases
    Different TestRunner or how to run your suite of testcases

    A TestRunner is able to run JUnit test cases; there are roughly two categories:

    • Textual TestRunner (console output)
      • The fastest to launch; it can be used when you don't need a red/green success indication. This is recommended with Ant.
    • Graphical TestRunners (client-server web GUI, Swing, AWT, in Eclipse...)
      • They show a simple graphical dialog to start/stop tests, display the results and provide some graphical progress indication.

    A TestRunner can be configured to be either loading or non-loading. In the loading configuration the TestRunner reloads your class from the class path for each run. As a consequence you don't have to restart the TestRunner after you have changed your code. In the non-loading configuration you have to restart the TestRunner after each run. The TestRunner configuration can be either set on the command line with the -noloading switch or in the junit.properties file located in the "user.home" by adding an entry loading=false.

    JUnit finds all test cases using the java.lang.reflect package: all methods whose names start with the word "test" will be found and called.

    In a Java main class:
    String[] listUnitTest = {ClassA.class.getName(), ClassB.class.getName()}; // list of class names containing your unit tests
    junit.textui.TestRunner.main(listUnitTest);  // text based
    junit.awtui.TestRunner.main(listUnitTest);   // AWT GUI: green means all tests successful, red is bad; in case of an error you see the stack and which test failed
    junit.swingui.TestRunner.main(listUnitTest); // Swing GUI: green means all test cases successful, red is bad; in case of an error you see the stack and which test failed
    The JUnit TestRunner in Eclipse is a standard view.

    Testsuite
    A test suite is a suite of test cases or test methods; you can give this test suite to a TestRunner, as sketched below.
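
    A minimal sketch of assembling a plain TestSuite and handing it to the text TestRunner (reusing the SquareTest class from the example above):

    Assembling a TestSuite
    import junit.framework.Test;
    import junit.framework.TestSuite;

    public class AllTests {

        public static Test suite() {
            // collects every testXXX method of the listed TestCase classes
            TestSuite suite = new TestSuite("All tests");
            suite.addTestSuite(SquareTest.class);
            return suite;
        }

        public static void main(String[] args) {
            junit.textui.TestRunner.run(suite());
        }
    }
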

    Some particular TestSuite

    Multi threading test
    If you need to have multiple threads hitting your class, ActiveTestSuite starts each test in its own thread. However, ActiveTestSuite does not have a constructor which automatically adds all testXXX methods of a class to the test suite. I tried the addTestSuite method with the class name as argument, but it added all tests in the class to run sequentially in the same thread. So I had to manually add each test name to the ActiveTestSuite.
    public static Test suite() {
        TestSuite suite = new ActiveTestSuite();
        suite.addTest(new com.waltercedric.junit.ClassA("testClonability"));
        suite.addTest(new com.waltercedric.junit.ClassA("testSerialization"));
        suite.addTest(new com.waltercedric.junit.ClassA("testRandom"));
        return suite;
    }

    public static void runTest(String[] args) {
        junit.textui.TestRunner.run(suite());
    }

    Extensions
    JUnit can be extended with third-party extensions; if you need some special capabilities, refer to this page: JUnit extensions

     

  • Log4J: A logging framework for J2EE

    Log4j homepage: http://jakarta.apache.org/log4j/

    Reference book on log4j:

    The Complete Log4j Manual

    by Ceki Gulcu
    Edition: Paperback

    Introduction
    Log4j is an open source (OSS) tool for inserting log statements into your application, developed by people at the Apache Foundation. Its speed and flexibility allow log statements to remain in shipped code while giving the user the ability to enable logging at runtime without modifying the application binary, all without incurring a high performance cost.

    Requirements

    • Log4j needs at least JDK 1.1.x to run.
    • The DOMConfigurator is based on the DOM Level 1 API. The DOMConfigurator.configure(Element) method will work with any XML parser that will pass it a DOM tree. The DOMConfigurator.configure(String filename) method and its variants require a JAXP compatible XML parser, for example Xerces or Sun's parser. Compiling the DOMConfigurator requires the presence of a JAXP parser in the classpath.
    • The org.apache.log4j.net.SMTPAppender relies on the JavaMail API. It has been tested with JavaMail API version 1.2. The JavaMail API requires the JavaBeans Activation Framework package.
    • The org.apache.log4j.net.JMSAppender requires the presence of the JMS API as well as JNDI.
    • Log4j test code relies on the JUnit testing framework in order to maintain quality of release.

    Why insert log statements or rely on this (old) technology?

    Advantages and drawbacks
    It offers several advantages: it provides precise context about a run of the application.
    Once inserted into the code:
    • It helps developers develop and correct bugs,
    • Generation of logging output requires no human intervention,
    • Log output can be saved to a persistent medium to be studied at a later time,
    • A rich logging package can also be viewed as an auditing tool, for example to determine performance,
    • Debugging statements stay with the program (for years) while debugging sessions are always transient (the lifetime of a bug resolution),
    • Logs can act as the glue between developers in a development environment and specialists in a production environment. The know-how or descriptions in log statements can help a production specialist understand how your application works.
    But:
    • It can slow down an application.
    • If the program verbosity is high, it can pollute the reader's mind or lead to misanalysis of a problem.
      For example:
      - saying something false in a log statement can have tremendous effects,
      - writing too much (irrelevant) info can hide a major error.

    Why choose Log4j? (From apache.org)

    • log4j is optimized for speed. Output writing has been rewritten for efficiency and can be asynchronous (compared to System.err).
    • log4j is based on a named logger hierarchy. (category)
    • log4j is fail-stop but not reliable.
    • log4j is thread-safe. No deadlocking threads or memory leaks.
    • log4j is not restricted to a predefined set of facilities.
    • Logging behavior can be set at runtime using a configuration file. Configuration files can be property files or in XML format.
    • log4j is designed to handle Java Exceptions from the start.
    • log4j can direct its output to a file, the console, a java.io.OutputStream, a java.io.Writer, a remote server using TCP, a remote Unix Syslog daemon, a remote listener using JMS, the NT EventLog, or it can even send e-mail (Appenders).
    • log4j uses 5 levels, namely DEBUG, INFO, WARN, ERROR and FATAL.
    • The format of the log output can be easily changed by extending the Layout class.
    • The target of the log output as well as the writing strategy can be altered by implementations of the Appender interface.
    • log4j supports multiple output appenders per logger
    • log4j supports internationalization.
    • It is used extensively by thousands of Java developers. If a flaw is discovered it gets fixed in the next release.
    • The log4j code is likely to be better than code you'd write yourself and is likely to improve over time.
    • Ports to other languages are: C++, Eiffel, Perl, .NET, Python, Ruby… more than 57 languages are supported.

    Log4j concepts

    Logger: loggers are responsible for handling the majority of log operations. The logger is the core component of the logging process.
    Levels: log4j can by default log messages with five priority levels (not including custom levels). More can be defined by subclassing, but this is not recommended.

    debug to write debugging messages which should not be printed when the application is in production.
    log.debug("Starting init of RequestController");

    info for messages similar to the "verbose" mode of many applications.
    log.info("Analyser init successfull");

    warn for warning messages which are logged to some log but the application is able to carry on without a problem.
    log.warn("Inconsistent value in conf for key 'debug', line 123 assuming default value true");

    error for application error messages which are also logged to some log but, still, the application can hobble along, such as when an administrator-supplied configuration parameter is incorrect and you fall back to using a hard-coded default value. You must use this level in all catch clauses if you cannot resolve the exception!
    log.error("The object Account is null");

    fatal for critical messages; after logging one of these, the application quits abnormally
    log.fatal("Can not get any new connection from database");

    Notes:

    A logger will only output messages that are of a level greater than or equal to its own level. If the level of a logger is not set, it will inherit the level of its closest ancestor. So if a logger is created in the package com.waltercedric.account and no level is set for it, it will inherit the level of the logger created for com.waltercedric. If no logger was created for com.waltercedric, the logger created for com.waltercedric.account will inherit the level of the root logger; the root logger is always instantiated and available.
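
    As a small illustration (package names reused from the example above), the effective level of a logger without an explicit level is inherited from its closest configured ancestor:

    Level inheritance sketch
    import org.apache.log4j.Level;
    import org.apache.log4j.Logger;

    public class LevelInheritanceDemo {

        public static void main(String[] args) {
            // the root logger is always available
            Logger.getRootLogger().setLevel(Level.WARN);

            // parent logger with an explicit level
            Logger parent = Logger.getLogger("com.waltercedric");
            parent.setLevel(Level.DEBUG);

            // child logger without a level of its own: it inherits DEBUG from com.waltercedric
            Logger child = Logger.getLogger("com.waltercedric.account");
            System.out.println(child.getEffectiveLevel()); // prints DEBUG
        }
    }
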

    Appender
    Appenders:
    1. Are responsible for controlling the output of log operations.
    2. Control where and how the logging result is stored.

    The Appenders available are (from the log4j API)

    • ConsoleAppender: appends log events to System.out or System.err using a layout specified by the user. The default target is System.out
    • DailyRollingFileAppender extends FileAppender so that the underlying file is rolled over at a user chosen frequency.
    • FileAppender appends log events to a file.
    • RollingFileAppender extends FileAppender to backup the log files when they reach a certain size.
    • WriterAppender appends log events to a Writer or an OutputStream depending on the user's choice.
    • SMTPAppender sends an e-mail when a specific logging event occurs, typically on errors or fatal errors.
    • SocketAppender sends LoggingEvent objects to a remote log server, usually a SocketNode.
    • SocketHubAppender sends LoggingEvent objects to a set of remote log servers, usually SocketNodes.
    • SyslogAppender sends messages to a remote syslog daemon.
    • TelnetAppender is a log4j appender that specializes in writing to a read-only socket.

    One may also implement the Appender interface to create one's own way of outputting log statements.

    Layout
    Layouts:
    1. Are responsible for formatting the output for an Appender.
    2. Are always used by an Appender.
    3. Know how to format the output.

    There are three types of Layout available:

    • HTMLLayout formats the output as an HTML table.
    • PatternLayout formats the output based on a conversion pattern specified, or if none is specified, the default conversion pattern.
    • SimpleLayout formats the output in a very simple manner, it prints the Level, then a dash '-' and then the log message.
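
    For illustration only (layouts are normally set in the configuration file), a layout can also be attached to an appender programmatically; the class name below is hypothetical:

    Attaching a PatternLayout programmatically
    import org.apache.log4j.ConsoleAppender;
    import org.apache.log4j.Logger;
    import org.apache.log4j.PatternLayout;

    public class LayoutDemo {

        public static void main(String[] args) {
            // the ConsoleAppender uses the PatternLayout to format every message it writes
            Logger root = Logger.getRootLogger();
            root.addAppender(new ConsoleAppender(new PatternLayout("%d %-5p %c - %m%n")));

            Logger.getLogger(LayoutDemo.class).info("Layout configured programmatically");
        }
    }
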

    Using Log4j in your code

    It is not recommended to use the log4j API directly: who knows whether a better logging framework won't come along in the future, or whether log4j won't change its APIs. The main idea, when you acquire a third-party component, is to build a wrapper around it. It is even better if the wrapper contains an abstract factory: in some cases you will have to use a different logging class (because of performance, licence...).

    A simple log4j wrapper
    package com.waltercedric;

    public class LogWrapper {

    ...
    }

    Using your newly created wrapper
    import com.waltercedric.LogWrapper;

    public void init() throws com.waltercedric.applicationException {

        LogWrapper logger = new LogWrapper(Account.class);
        logger.info("Starting init");

        logger.debug("create an Account");
        up = new Account(new NullObject());
    }
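
    A possible sketch of such a wrapper (names are illustrative, not a prescribed implementation), delegating to log4j behind a small API of your own:

    A possible LogWrapper sketch
    package com.waltercedric;

    import org.apache.log4j.Logger;

    // The rest of the application depends only on LogWrapper, so the underlying
    // logging framework can be swapped later without touching the callers.
    public class LogWrapper {

        private final Logger delegate;

        public LogWrapper(Class clazz) {
            this.delegate = Logger.getLogger(clazz);
        }

        public void info(String message) {
            delegate.info(message);
        }

        public void debug(String message) {
            delegate.debug(message);
        }

        // exposed so callers can guard expensive message formatting (see guideline 4 below)
        public boolean isDebugEnabled() {
            return delegate.isDebugEnabled();
        }
    }
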

    Log4j Guidelines
    The log4j FAQ is a must-read; here are the most important points:

    1. Respect levels!
      Respect levels and categorize the logs according to severity and message size. Define a special logger (restricted to a package) that can be switched off and that does not write too many statements to the log output.
    2. Meaningful statements
      Do not litter your code with System.err.println or System.out.println. If you are doing some internal reviews of your code, please try to write some meaningful information in the logs. Avoid logs of the type "I am here", "here 1", "here 2" and so on.
    3. Class-wide static logger
      It is recommended to provide a class-wide logger access point if you need to do a lot of output in a class or hierarchy. Define a protected logger at the top of the hierarchy
      public class Mammals {
          protected static LogWrapper logger = LogFactory.getLog(Mammals.class);
          ...
      }
      and use it in all children
      public class Human extends Mammals {

          public Human() {
              super();
              logger.debug("init");
          }
      }

    4. Increase speed
      Log4j is not slow; it can even be faster than System.out or System.err (System.err and System.out are synchronous while log4j is not). The cost in time comes mostly from formatting the messages!
      If you know that the output message is heavily formatted, do not use the following:
      myLogger.debug("Cash balance is " + cashBalance.toXML());
      use instead
      if (myLogger.isDebugEnabled()) {
          myLogger.debug("Cash balance is " + cashBalance.toXML());
      }
    5. How to name Loggers?
      You can name loggers by locality. It turns out that instantiating a logger in each class, with the logger name equal to the fully-qualified name of the class, is a useful and straightforward approach of defining loggers. This approach has many benefits:
    • It is very simple to implement.
    • It is very simple to explain to new developers.
    • It automatically mirrors your application's own modular design.
    • It can be further refined at will.
    • Printing the logger automatically gives information on the locality of the log statement.

    However, this is not the only way for naming loggers. A common alternative is to name loggers by functional areas. For example, the "database" logger, "RMI" logger, "security" logger, or the "XML" logger. You are totally free in choosing the names of your loggers. The log4j package merely allows you to manage your names in a hierarchy. However, it is your responsibility to define this hierarchy. Note by naming loggers by locality one tends to name things by functionality, since in most cases the locality relates closely to functionality.
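
    A short sketch contrasting the two naming schemes (class and logger names are illustrative):

    Naming loggers by locality or by functional area
    import org.apache.log4j.Logger;

    public class LoggerNamingDemo {

        // named by locality: the fully-qualified class name
        private static final Logger CLASS_LOGGER = Logger.getLogger(LoggerNamingDemo.class);

        // named by functional area
        private static final Logger DB_LOGGER = Logger.getLogger("database");

        public static void main(String[] args) {
            CLASS_LOGGER.debug("level and appenders resolved through the package hierarchy");
            DB_LOGGER.info("level and appenders resolved through the functional-area name");
        }
    }
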

    Remote logging over TCP
    Read carefully: http://jakarta.apache.org/log4j/docs/api/org/apache/log4j/net/SocketAppender.html and
    http://jakarta.apache.org/log4j/docs/api/org/apache/log4j/net/SocketHubAppender.html

    Starting the server: Chainsaw
    Chainsaw is a graphical logging client where you can see, sort and filter log data.
    Documentation can be read here: http://jakarta.apache.org/log4j/docs/api/org/apache/log4j/chainsaw/package-summary.html and it is part of log4j.jar.

    Starting Chainsaw
    c:\jdk1.4.2\bin\java -Dchainsaw.port=5000 org.apache.log4j.chainsaw.Main
    1. Log4j gives you the ability to send messages to a remote location over a socket for logging purposes. The org.apache.log4j.net.SocketAppender and org.apache.log4j.net.SocketServer classes are the key classes used in remote logging.
    2. Modify all loggers in your log4j configuration to use a SocketAppender as appender. Once you have loaded this configuration, all messages will be written to the machine and port that you specify.
    3. Start the client application (Chainsaw); this program will receive the logs and show them in a Swing GUI.
    Example of a TCP appender (properties format)
    log4j.appender.remote=org.apache.log4j.net.SocketAppender
    log4j.appender.remote.RemoteHost=localhost
    log4j.appender.remote.Port=5000
    log4j.appender.remote.LocationInfo=true

    On the server side (the machine that receives the logs), you will need to run log4j's SocketServer class. You can create a configuration file with information similar to the following (the whole application is in DEBUG mode):

    Example of socketserver.properties
    log4j.rootCategory=DEBUG,log1

    ############################
    # log1 is set to be a file

    log4j.appender.log1=org.apache.log4j.RollingFileAppender
    log4j.appender.log1.MaxFileSize=100KB
    log4j.appender.log1.MaxBackupIndex=1
    log4j.appender.log1.File=c://logs.log
    log4j.appender.log1.append = true
    log4j.appender.log1.layout=org.apache.log4j.PatternLayout
    log4j.appender.log1.layout.ConversionPattern=%p %t %c - %m%n
    1. Set up your CLASSPATH on both the client and server to contain log4j.jar
    2. Run the SocketServer at the command line. The command line syntax for the SocketServer is as follows:
      java org.apache.log4j.net.SocketServer portNumber configurationFile configurationDirectory
    Start the server
    java org.apache.log4j.net.SocketServer 5000 C:\socketserver.properties C:\temp

    Start your application: without any change to your code or recompiling it, you can now log data remotely!

    Configuring log4j

    Location of configuration file
    The log4j configuration file must be in the classpath; if more than one is in the classpath, the first one found will be used. Log4j requires a compatible XML parser in the classpath in order to read the configuration file; by default, log4j uses crimson.jar.

    Location of DTD
    The DTD is needed in order to initialize log4j; two solutions are available:

    Public DTD: the file must be on the internet or on a network/file-system path, but with a fixed path (URI):

    <!DOCTYPE log4j:configuration SYSTEM "http://www.waltercedric.com/log4j.dtd">

    Extending log4j

    Defining your application specific loggers, appenders and layouts
    You can look at the Log4j API to see how to implement a logger, appender and layout.

    Conclusions

    One of the strengths of log4j is that it does not require recompiling the Java code into binary classes to change considerably the amount of log output. You can add logging statements to your code and, without changing the shipped code, change the amount of log output at runtime. The major logging strategies are thus defined in this configuration file (it can be a properties file or an XML file). You should store this file in the classpath of your application.

    Annexes

    Example of configuration files:

    Example of log4j.xml

    Example of log4j.properties
    ###########################################################################
    #
    # log4Java properties
    #
    # Documentation can be found at
    # http://jakarta.apache.org/log4j/docs/api/index.html
    # There is no other documentation except forum, a commercial book is due (oreilly)
    #
    # To permit reloading during runtime, the LogDecorator will test each 60s if the file has changed
    # and update configuration of log4j if needed
    #
    # Ascending priority: DEBUG < INFO < WARN < ERROR < FATAL
    # a log statement is visible only if its level >= the configured level
    #
    # current layout can be: DateLayout, HTMLLayout, PatternLayout, SimpleLayout, XMLLayout
    #
    ###########################################################################

    # Set root logger level to [FATAL|ERROR|WARN|INFO|DEBUG], and provide default appender

    log4j.rootLogger=DEBUG, stdout

    ############################
    # define category (and their level [INHERITED|FATAL|ERROR|WARN|INFO|DEBUG] and appender)
    # category should be fully qualified class name or incomplete package name
    # Note that you inherit from the root logger unless otherwise specified (see the additivity flag)
    #
    # additivity=true (default): all requests will also be forwarded up the hierarchy
    # -> logged twice if the same appender is already present in the hierarchy
    # additivity=false: do not forward to ancestor appenders
    #
    # INHERITED can optionally be specified, which means that the named category should inherit
    # its priority from the category hierarchy. If you set the additivity flag to false,
    # you do not inherit the appenders
    ##

    log4j.category.com.waltercedric.account=INHERIT, log1
    log4j.additivity.com.waltercedric.account=false

    log4j.category.com.waltercedric=DEBUG, log1

    ########################################################
    # You can define as many appenders as you want
    ########################################################

    ############################
    # stdout is set to be a ConsoleAppender.
    ##

    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    # see http://jakarta.apache.org/log4j/docs/api/org/apache/log4j/PatternLayout.html
    log4j.appender.stdout.layout.ConversionPattern=%d %r [%t] %-5p %c - %m%n

    ##################################
    # log1 is set to be a file by date

    log4j.appender.log1=org.apache.log4j.DailyRollingFileAppender
    # rollover each day at midnight, see the DailyRollingFileAppender class
    log4j.appender.log1.DatePattern='.'yyyy-MM-dd
    # or roll by size:
    #log4j.appender.log1=org.apache.log4j.RollingFileAppender
    #log4j.appender.log1.MaxFileSize=100KB
    #log4j.appender.log1.MaxBackupIndex=1


    #/WEB-INF/conf/Log4j.properties
    log4j.appender.log1.File=c://VirtualTransport.log
    log4j.appender.log1.append = true
    log4j.appender.log1.layout=org.apache.log4j.PatternLayout
    # see http://jakarta.apache.org/log4j/docs/api/org/apache/log4j/PatternLayout.html
    # %-4r [%t] %-5p %c %x - %m%n leads to: 331 [main] ERROR com.waltercedric.account - ClassCastException

    log4j.appender.log1.layout.ConversionPattern=%p %t %c - %m%n

    ############################
    # eMail logging
    #
    # SMTPAppender will store all the logging events on an
    # internal cache and it will send all the messages when
    # the TriggeringEventEvaluator you set with the
    # setEvaluatorMethod or the constructor parameter return true.
    # By default the evaluator is set with an instance of
    # DefaultEvaluator which is a package-private class
    # defined in the same compilation unit of SMTPAppender.
    # This evaluator will return true only when the logging
    # event has a priority greater or equal than ERROR.
    ##

    log4j.appender.email=org.apache.log4j.net.SMTPAppender
    log4j.appender.email.Threshold=FATAL
    log4j.appender.email.SMTPHost=XXX.XXX.XXX.XXX
    log4j.appender.email.To=
    log4j.appender.email.From=
    log4j.appender.email.Subject=A fatal error has occurred in your application
    log4j.appender.email.BufferSize=1
    log4j.appender.email.layout=org.apache.log4j.PatternLayout
    # see http://jakarta.apache.org/log4j/docs/api/org/apache/log4j/PatternLayout.html
    log4j.appender.email.layout.ConversionPattern=%d{ABSOLUTE} (%F:%L) - %m%n

    ############################
    # remote socket server logging
    #
    # The SocketAppender has the following properties:
    # please read: http://jakarta.apache.org/log4j/docs/api/org/apache/log4j/net/SocketAppender.html
    #
    # If you want a server that listens, you can start the Chainsaw utility
    # (Swing GUI); read how at http://jakarta.apache.org/log4j/docs/api/org/apache/log4j/chainsaw/package-summary.html
    # Chainsaw is a particular kind of server!
    ##

    log4j.appender.CHAINSAW_CLIENT=org.apache.log4j.net.SocketAppender
    log4j.appender.CHAINSAW_CLIENT.RemoteHost=localhost
    log4j.appender.CHAINSAW_CLIENT.Port=5000
    log4j.appender.CHAINSAW_CLIENT.LocationInfo=true

    Resources

  • First a big thanks to Packt Publishing for having sent me this book to review! I did enjoy going through this book; while I did not learn a lot of new stuff (I have been using Apache Maven daily since 2006!), I found it to be concise and would recommend it anytime to any of my colleagues. But let's go through my review of this cookbook of over 50 recipes towards optimal Java software engineering with Maven 3:

    Apache Maven 3 Cookbook is a clear, precise, well-written book that gives readers clear recipes for the release process using Apache Maven 3. The authors give a step-by-step account of expectations and hurdles for software development.

    The first few chapters quickly bring you to the point to be comfortable using Maven on straightforward projects, and the later chapters provide even more recipes examples on subjects like running a Repository Manager, Writing Plugins, and details on various techniques. The book also covers numerous real world software delivery issues such as multi-module projects, web/enterprise projects, dependency management, automatic testing and documentation.

    To sum up the key points of this 224-page book in a few bullets:

  • Chapter 1: Basics of Apache Maven: Setting up Apache Maven on Windows/Linux/Mac, Creating a new project, Understanding the Project Object Model, build lifecycle and build profiles,
  • Chapter 2: Software Engineering Techniques: Build automation, modularization, Dependency management, Source code quality check, Test Driven Development (TDD), Acceptance testing automation and Deployment automation,
  • Chapter 3: Agile Team Collaboration: Creating centralized remote repositories, Performing continuous integration with Hudson, Integrating source code management, Team integration with Apache Maven, Implementing environment integration, Distributed development and Working in offline mode,
  • Chapter 4: Reporting and Documentation: javadocs, unit tests, coverage reports and Maven dashboard setup,
  • Chapter 5: Java Development with Maven: Java web application, J2EE, Spring, Hibernate and JBoss SEAM development,
  • Chapter 6: Google Development with Maven: Android and GWT (Google Web Toolkit), Google App Engine deployment,
  • Chapter 7: Scala, Groovy, and Adobe Flex
  • Chapter 8: IDE Integration
  • Chapter 9: Extending Apache Maven: creating plugins using Java, Apache ANT or Ruby,
  • The author, Srirangan, goes into detail describing each of these themes.

    I recommend this book if:

  • If you need to learn Apache Maven quickly, you can go through the recipes and examples and come away with a good knowledge of Maven.
  • If you are currently implementing Apache Maven for the first time in your development process and feel a bit lost by the lack of clear examples that just run.
  • If you want to use proven solutions to real common engineering challenges: this book will save you a lot of time!

    If you want to be able to deliver your software to any target environment using continuous delivery processes, chances are high that Apache Maven is the right tool for the job, and this book should be part of your technical library, besides of course Sonatype's free online book Maven: The Complete Reference.

  • Thanks to Packt Publishing for having sent me this book to review. I will publish a review in the coming days.

    • Grasp the fundamentals and extend Apache Maven 3 to meet your needs
    • Implement engineering practices in your application development process with Apache Maven
    • Collaboration techniques for Agile teams with Apache Maven
    • Use Apache Maven with Java, Enterprise Frameworks, and various other cutting-edge technologies
    • Develop for Google Web Toolkit, Google App Engine, and Android Platforms using Apache Maven

    Apache_Maven3_Cookbook

    You may also consider reading all my articles related to Apache Maven

  • I got this email from Cyprian Sniegota, who developed a Maven archetype to ease the development of Joomla! extensions. His archetype currently supports the creation of skeletons for components, modules, plugins and templates.

    I noticed some time ago that you described combination of Joomla! and Maven. Few weeks ago i wrote joomla-maven-plugin with skeleton projects (sources: bitbucket.org/deviapps) based on php-maven.org work.
    Here is short description http://deviapps.com/create-joomla-extension-with-maven and 5 min video (in Polish so far) http://www.youtube.com/watch?v=aE8w9EZciTg
    I hope you will be interested.

    Thanks to him for having written this project. I will also try to Maven-ize what Joomla! has done with Ant in the future (I now prefer a crystal-clear software lifecycle).

  • apache_maven

    cargo-banner-left

    Following the post about Deploy to Tomcat 6 using Maven, here is a ready to use example with the main differences explained in the table below

    • containerId: Tomcat 7 uses <containerId>tomcat7x</containerId>, Tomcat 6 uses <containerId>tomcat6x</containerId>
    • URL of the Tomcat manager: Tomcat 7 uses <cargo.remote.uri>, Tomcat 6 uses <cargo.tomcat.manager.url>
    • example: http://host..com/manager/text/ (Tomcat 7) versus http://host..com/manager/ (Tomcat 6)
    • tomcat-users.xml:

    Tomcat 7:
    <tomcat-users>
    <role rolename="manager-gui"/>
    <role rolename="manager-script"/>
    <role rolename="manager-jmx"/>
    <role rolename="manager-status"/>
    <user username="admin" password="admin" roles="manager-gui,manager-script"/>
    </tomcat-users>

    Tomcat 6:
    <tomcat-users>
      <role rolename="manager"/>
      <user username="admin" password="admin" roles="manager"/>
    </tomcat-users>

    And finally a snippet of an Apache Maven pom.xml ready to use in a profile, so you can reuse this profile like a method call

    <profile>
     <id>deployTomcat</id>
    <activation>
      <activeByDefault>false</activeByDefault>
    </activation>
    <build>
     <plugins>
        <plugin>
         <groupId>org.codehaus.cargo</groupId>
         <artifactId>cargo-maven2-plugin</artifactId>
         <version>1.1.0</version>
        <configuration>
         <wait>true</wait>
         <container>
          <containerId>tomcat7x</containerId>
          <type>remote</type>
         </container>
         <configuration>
          <type>runtime</type>
          <properties>
           <cargo.remote.uri>
             ${tomcat.url}
           </cargo.remote.uri>
           <cargo.remote.username>
              ${tomcat.user}     
           </cargo.remote.username>
            <cargo.remote.password>
              ${tomcat.pwd}
            </cargo.remote.password>
          </properties>
          </configuration>
          <deployer>
           <type>remote</type>
           <deployables>
           <deployable>
            <groupId>${deploy.groupid}</groupId>
            <artifactId>${deploy.artifactid}</artifactId>
            <type>war</type>
            <properties>
             <context>${deploy.context}</context>
            </properties>
           </deployable>
          </deployables>
         </deployer>
        </configuration>
        <executions>
         <execution>
          <id>verify-deploy</id>
          <phase>pre-integration-test</phase>
          <goals>
           <goal>deployer-undeploy</goal>
           <goal>deployer-deploy</goal>
          </goals>
         </execution>
        </executions>
        </plugin>
     </plugins>
    </build>
    </profile>

    Place as many profiles as you have machines to deploy to in settings.xml, and declare some variables as properties, as shown below:

    <profile>
     <id>serverA</id>
     <activation>
        <activeByDefault>false</activeByDefault>
     </activation>
     <properties>
        <tomcat.url>http://host.com/manager/text</tomcat.url>
        <tomcat.user>admin</tomcat.user>
        <tomcat.pwd>admin</tomcat.pwd>
        <!-- these properties must be defined
           as system property or -D -->
        <!-- - deployable.artifactid:
             artifactId of web application to be deployed -->
        <!-- - deployable.context: web context name -->
     </properties>
    </profile>

    So you can run against, and target, multiple hosts by just changing the profile name serverA to something else.

    mvn integration-test -PdeployTomcat,serverA
       -Ddeployable.artifactid=demo
       -Ddeploy.groupid=com.mycompany
       -Ddeployable.context=showcase
  • I will show you, in an Apache Maven configuration file, how to copy files to a server each time the package phase is executed.

    Solution with Ant SCP task

    This ready-to-use snippet makes use of the Apache Ant scp task. Just put it in the Maven module where the assembly is executed (or anywhere else) to push all tar.gz files to a server when you run mvn package; you can add as many Ant tasks as you want and push the same files to many servers during the reactor build.

    <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.7</version>
    <executions>
    <execution>
        <id>server-copy</id>
        <goals>
            <goal>run</goal>
        </goals>
        <phase>package</phase>
        <configuration>
            <target>
                <echo message="Push to server/home/"/>
                <scp trust="yes"
                    todir="user:password@server:/home/">
                    <fileset dir="${basedir}/target">
                        <include name="**/*.tar.gz"/>
                    </fileset>
                </scp>
            </target>
        </configuration>
    </execution>
    </executions>
    <dependencies>
    <dependency>
        <groupId>org.apache.ant</groupId>
        <artifactId>ant-jsch</artifactId>
        <version>1.8.2</version>
    </dependency>
    </dependencies>
    </plugin>

    Solution with maven-deploy-plugin

    The maven-deploy-plugin allows you to configure the deploy phase to deploy to a server using scp. There is a page in the documentation that describes how it can be done.

    Deploy maven artifact using Maven Wagon SCP

    Another alternative would be to use Maven Wagon SCP, as described in this post for example.

  • apache_maven

    In which order are Apache Maven profiles executed? Are Apache Maven profiles ordered? How can you ensure that Apache Maven profiles are activated in the right order?

    You normally don't end up with these questions; issues may only appear if

    • some profiles depend on each other,
    • some profiles cannot run in just any order.

    The use case behind this article is very simple, as I have a continuous build where:

    • 5 web applications have  to be deployed into a remote tomcat in phase pre-integration-test,
    • 2 databases are created for test cases in phase generate-test-resources
    • 1 more database is created and needed for runtime, done in phase pre-integration-test
    • One of these web applications is able to inject data into the database using web services; a profile does this in phase pre-integration-test,
    • Selenium test cases are run in phase integration-test

    All these steps are done using several Apache Maven pom profiles.

    As it is a bit complicated to explain, let's first refresh some Apache Maven concepts.

    Apache Maven Goals

    First you'll have to keep in mind the Apache Maven lifecycle of modules, 21 phases out of the box:

    • validate: validate that the project is correct and all necessary information is available
    • generate-sources: generate any source code for inclusion in compilation
    • process-sources: process the source code, for example to filter any values
    • generate-resources: generate resources for inclusion in the package
    • process-resources: copy and process the resources into the destination directory, ready for packaging
    • compile: compile the source code of the project
    • process-classes: post-process the generated files from compilation, for example to do bytecode enhancement on Java classes
    • generate-test-sources: generate any test source code for inclusion in compilation
    • process-test-sources: process the test source code, for example to filter any values
    • generate-test-resources: create resources for testing
    • process-test-resources: copy and process the resources into the test destination directory
    • test-compile: compile the test source code into the test destination directory
    • test: run tests using a suitable unit testing framework. These tests should not require the code to be packaged or deployed
    • prepare-package: perform any operations necessary to prepare a package before the actual packaging. This often results in an unpacked, processed version of the package
    • package: take the compiled code and package it in its distributable format, such as a JAR
    • pre-integration-test: perform actions required before integration tests are executed. This may involve things such as setting up the required environment
    • integration-test: process and deploy the package if necessary into an environment where integration tests can be run (Selenium test cases, for example)
    • post-integration-test: perform actions required after integration tests have been executed. This may include cleaning up the environment
    • verify: run any checks to verify that the package is valid and meets quality criteria
    • install: install the package into the local repository, for use as a dependency in other projects locally
    • deploy: the code is deployed to an artifact repository or copied with ftp/scp for distribution

    If you run the goal compile

    mvn compile

    on a simple multi-module project, EVERY module, one after the other, will go through these phases:
    validate –> generate-sources –> process-sources –> generate-resources –> process-resources –> compile

    Apache Maven reactor

    The reactor is the part of Apache Maven that allows you to execute a goal on a set of modules. As mentioned in the Apache Maven 1.x documentation on multi-module builds, while modules are discrete units of work, they can be gathered together using the reactor to build them simultaneously, and:

    The reactor determines the correct build order from the dependencies stated by each project in their respective project descriptors, and will then execute a stated set of goals. It can be used for both building projects and other goals, such as site generation.

    The reactor is what makes multi-module builds possible: it computes the oriented graph of dependencies between modules, derives the build order from this graph and then executes goals on the modules. In other words, a "multi-modules build" is a "reactor build" and a "reactor build" is a "multi-modules build".

    A simple multi modules project

    For the sake of the example, it has module and profile dependencies; in myProject/pom.xml:

    <modules>
      <module>remoting</module>
      <module>web</module>
      <module>monitoring</module>
      <module>common</module>
      <module>services</module>
    </modules>

    or if you prefer the directory layout

    myProject
        |_ pom.xml
        |_ common
            |_ src
            |_ pom.xml
        |_ web
            |_ src
            |_ pom.xml
        |_ remoting
            |_ src
            |_ pom.xml
        |_ services
            |_ src
            |_ pom.xml
        |_ monitoring
            |_ src
            |_ pom.xml

    Let's also assume I would like to apply a list of profiles named

    • deployWeb, deploy the war module using cargo to a running tomcat instance
    • createDatabase, create a mysql database from scratch
    • runSelenium, run Selenium tests in phase integration-test against web, assuming the database is created first
    • deployMonitoring, deploy the war module using cargo to a running tomcat instance; it queries the web application at startup to get some info.

    Maven calculates the module order in the reactor based on dependencies, as seen in the log file after running

    mvn compile

    [INFO] Reactor build order:  Unnamed - com.waltercedric:myproject:pom:0.0.1-SNAPSHOT
    Unnamed - com.waltercedric:common:jar:0.0.1-SNAPSHOT
    Unnamed - com.waltercedric:services:jar:0.0.1-SNAPSHOT
    Unnamed - com.waltercedric:remoting:ear:0.0.1-SNAPSHOT
    Unnamed - com.waltercedric:web:war:0.0.1-SNAPSHOT
    Unnamed - com.waltercedric:monitoring:war:0.0.1-SNAPSHOT

    Example

    It starts to get complicated when you provide a list of profiles on the Apache Maven command line like this

    mvn post-integration-test -PdeployWeb,createDatabase,runSelenium,deployMonitoring

    Chances are high that you will get profiles executed in the wrong order, too early or too late.

    Rule 1 profiles are activated (if found) following reactor modules order

    The first rule is that profiles are activated in module reactor order: if myProject is first, it will go through all 18 phases of Apache Maven (from validate to post-integration-test in my example). Keep in mind also that the list of profiles will be applied to EVERY module in EVERY phase, starting at the topmost module in the reactor.

    • On module myproject:
      • Apache Maven will activate the profiles deployWeb, createDatabase, runSelenium, deployMonitoring if one or more of them are present in myproject/pom.xml
    • On module common:
      • Apache Maven will activate the profiles deployWeb, createDatabase, runSelenium, deployMonitoring if one or more of them are present in common/pom.xml
    • and so on…

    Rule 2  Reactor modules order “may” be changed

    And now the tricky part: you can normally NOT change the module order in the reactor. That's OK, but…

    The order you define in myProject/pom.xml for <modules> (module aggregation) is still kept if the Maven dependency resolver doesn't see any problem.

    Not clear enough? Look at the two examples below:

    Example 1
    myProject/pom.xml module order: remoting, web, monitoring, common, services
    Reactor build order (seen in the logs) after mvn post-integration-test:
    1. myProject
    2. common
    3. services
    4. remoting
    5. web
    6. monitoring
    Remarks: Maven adapts the order based on the oriented graph of dependencies between modules.

    Example 2
    myProject/pom.xml module order: remoting, monitoring, web, common, services
    Reactor build order (seen in the logs) after mvn post-integration-test:
    1. myProject
    2. common
    3. services
    4. remoting
    5. monitoring
    6. web
    Remarks: swapping modules that have no direct connection to each other and no conflicting dependencies with other modules may result in a different reactor order, and therefore a different profile execution order!

    Since Apache Maven has detected that the modules monitoring and web have no connection, it accepts the "human/natural" order found in myProject/pom.xml.

    You may have to use this technique to distribute your profiles across pom.xml files while still keeping the profile execution order under control.

    Rule 3 Maven profile order is not taken from command line

    The order of the profiles in the Apache Maven command-line -P list is not taken into account; running the expected profile order

    mvn post-integration-test -PdeployWeb,createDatabase,runSelenium,deployMonitoring

    is equivalent to

    mvn post-integration-test -PcreateDatabase,deployMonitoring,deployWeb,runSelenium

     

     

    That is a good thing, as a command-line ordering would simply make no sense across all modules and all Maven phases combined together.

    Rule 4 You can force profiles to run in an order if you SORT them accordingly into ONE pom.xml

    Apache Maven recommends placing profiles into the module where they act.

    If I want to ensure that the profiles deployWeb and createDatabase run before the profile runSelenium, you have to keep that order in the pom.xml, even if these profiles act in different Maven phases:

    • createDatabase  may run in phase generate-test-resources 
    • deployWeb run in phase pre-integration-test
    • runSelenium run in phase integration-test

    Considering the module ordering in the reactor, a good pom.xml candidate could be web/pom.xml, like this:



    <profiles>
      <profile>
        <id>createDatabase</id>
        ...
      </profile>
      <profile>
        <id>deployWeb</id>
        ...
      </profile>
      <profile>
        <id>runSelenium</id>
        ...
      </profile>
    </profiles>

    References

    Profiles">http://maven.apache.org/pom.htmlProfiles

  • A sample chapter discusses the different approaches to code generation and looks at best practices for applying code generation techniques to the development of enterprise software for the J2EE platform. It is from the book Rapid J2EE™ Development: An Adaptive Foundation for Enterprise Applications (Prentice Hall PTR).
    Code generation methods offer a means of delivering enterprise solutions extremely rapidly by reducing the mundane, repetitive tasks developers face.
    www.eclipse.org is also hosting Eclipse Modeling Framework (EMF) EMF is a modeling framework and code generation facility for building tools and other applications based on a structured data model. From a model specification described in XMI, EMF provides tools and runtime support to produce a set of Java classes for the model, a set of adapter classes that enable viewing and command-based editing of the model, and a basic editor. Models can be specified using annotated Java, XML documents, or modeling tools like Rational Rose, then imported into EMF. Most important of all, EMF provides the foundation for interoperability with other EMF-based tools and applications.

  • joomla_logo

    These are the scripts I use to maintain all my 3 demo Joomla! sites:

    These scripts increase security and try to standardize how to create, update and maintain a Joomla! demo site. Feel free to submit ideas, send me suggestions on how to improve them, or ask for help.

     

    This project is hosted at http://forge.joomla.org/gf/project/demosite/ under a GPL v3.0 license and the latest documentation can be found in my WIKI

    Architecture

    • 1 script (snapshotit.bat) per Joomla! instance to create snapshots (files + database) and save the result in a zip file.
    • 1 generic script (renew.sh) that renews an instance of Joomla! (files + database) and secures it at the same time.

    Prerequisites

    1. An access to a Linux bash on your server, ideally as root
    2. The possibility to define new crontab entries

    Locally

    On your desktop or reference server, install (preferably in xampp/htdocs) as many versions of Joomla! as needed. These directories contain the Joomla! versions in which you will be able to install, remove and configure your extensions. I personally have them in XAMPP:

    demo-joomla-1.0/
    demo-joomla-1.5/
    demo-joomla-1.6/

    Into each of these Joomla! installations, copy the file snapshotit.bat and configure its variables accordingly. The file is well documented, so the variables are not described here.

    This small batch file makes a snapshot of all files and the database and creates a new zip file, for example demo-joomla-1.5.zip.

    Consider while installing Joomla!

    1. Do not choose the default table prefix jos_ but something longer and more random, like gZ45dF_, to mitigate SQL injection
    2. Do not name your admin user admin; choose something longer and more random, like Fdhtz56df_Gdte34, to reduce the risk of brute forcing the administrator login / SQL injection

    On the server

    Now copy the file demo-joomla-1.5.zip to your server, using FTP or SSH.

    Also copy renew.sh to your server, using FTP or SSH.

    Setup crontab

    Add the following line to your crontab for each of your demo sites; I renew the demo sites every 30 minutes

    $ crontab -e

    add this line

    30      *       *       *       *       locationOf_renew.sh locationOf_zip locationof_httpdocs dbuser dbpassword dbtablename unixuser unixgrp

    where

    • locationOf_renew.sh: fully qualified path to renew.sh
    • locationOf_zip: fully qualified path of the zip file (containing Joomla! and the .sql file)
    • locationof_httpdocs: fully qualified path of the httpdocs directory where the zip file content will be extracted
    • dbuser: database user that is used by Joomla!
    • dbpassword: database user password that is used by Joomla!
    • dbtablename: database schema name that is used by Joomla!
    • unixuser: unix user that is supposed to own all files in httpdocs, for example cedric
    • unixgrp: unix group that is supposed to own all files in httpdocs, for example psaserv

    Renew.sh

    The script renew.sh does the following with the zip file:

    1. Deletes all files in locationof_httpdocs, removing any potential security threats and settings changes made by visitors of your demo site
    2. Locks the demo site by temporarily adding .htaccess and .htpasswd files
    3. Unzips all files in demo-joomla-1.5.zip to locationof_httpdocs
    4. Restores the database with the file demo-joomla-1.5.sql found in demo-joomla-1.5.zip
    5. Changes the owner and group of all files to the right ones (unixuser, unixgrp)
    6. Changes all files and directories to the minimum required set of permissions (555 for directories and 444 for files)
    7. Makes the Joomla! cache directory read-write for the owner unixuser
    8. Deletes the file demo-joomla-1.5.sql
    9. Removes potentially dangerous components from the demo site, among others
      1. com_media, removing the users' right to upload, alter or delete files
      2. com_config, removing the users' right to change the configuration
      3. com_installer, removing the users' right to install extensions
      4. removes installation or installation.old if present
    10. Unlocks the demo site by removing the .htaccess and .htpasswd files and restoring the ones from the zip file

    All in all, thanks to this development, my 3 demo sites are now online, updates will be a lot easier, and I will keep them up to date more often :-)

    Joomla! 1.0 tricks

    In the Joomla! 1.0 configuration.php, I use the following trick to avoid any stage-dependent values.

    $mosConfig_absolute_path = dirname(__FILE__);
    $mosConfig_cachepath = dirname(__FILE__).'/cache';
  • Stop waiting for build & deploy to make code changes. Write code and refresh your browser!

    Use DCEVM to add Java fields, methods and classes and use them without restarting your application server. DCEVM is a modification of the HotSpot VM that allows unlimited class redefinition at run-time: you can add/remove fields and methods and change the super types of a class at run-time. The features of DCEVM are likely to be integrated in a future update of Java 8 as part of JEP 159.

    View code changes instantly and increase team velocity!

    Capability                                  DCEVM   JVM Hot Swap
    Changes to method bodies                    yes     yes
    Adding/removing methods                     yes     no
    Adding/removing constructors                yes     no
    Adding/removing fields                      yes     no
    Adding/removing classes                     yes     no
    Adding/removing annotations                 yes     no
    Changing static field value                 yes     no
    Adding/removing enum values                 yes     no
    Modifying interfaces                        yes     no
    Replacing superclass                        yes     no
    Adding/removing implemented interfaces      no      no
    Initializes new instance fields             yes     no

     

  • The BeyondCorp project scraps the notion of a corporate network and moves to a zero-trust model...

    Google sees little distinction between boardrooms and bars, cubicles and coffee shops; all are untrusted under its perimeter-less security model detailed in a paper published this week. The "BeyondCorp model", under development for more than five years, is a zero-trust network model where the user is king and login location means little. Staff devices including laptops and phones are logged into a device inventory service which contains trust information and snapshots of the devices at a given time. Employees are awarded varying levels of trust provided they meet minimum criteria, which authors Barclay Osborn, Justin McWilliams, Betsy Beyer, and Max Saltonstall say reduces maintenance cost and improves device usability (PDF)

    White Paper 
    https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/44860.pdf 

     

  • apache_maven_thumb

    Scenarios

    • You want to control Maven during dependency resolution and break the build if some conditions are not met,
    • You want to detect dependency conflicts early during the build,
    • You want to prevent anybody in your team from using dependency x in version y

    This is where the Maven Enforcer Plugin will assist you:

    The Enforcer plugin provides goals to control certain environmental constraints such as Maven version, JDK version and OS family along with many more standard rules and user created rules.

    Add the following to your pom.xml to configure the plugin.
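
    As a minimal sketch of such a configuration (the plugin version and the specific rules shown here, requireMavenVersion, requireJavaVersion and bannedDependencies, are illustrative choices, not the only possibilities):

    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-enforcer-plugin</artifactId>
          <version>1.4.1</version> <!-- example version -->
          <executions>
            <execution>
              <id>enforce-rules</id>
              <goals>
                <goal>enforce</goal>
              </goals>
              <configuration>
                <rules>
                  <!-- break the build if Maven or the JDK is too old -->
                  <requireMavenVersion>
                    <version>[2.0.9,)</version>
                  </requireMavenVersion>
                  <requireJavaVersion>
                    <version>1.6</version>
                  </requireJavaVersion>
                  <!-- forbid a dependency x in version y for the whole team -->
                  <bannedDependencies>
                    <excludes>
                      <exclude>commons-logging:commons-logging:1.0</exclude>
                    </excludes>
                  </bannedDependencies>
                </rules>
                <fail>true</fail>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>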

  • apache_maven

    Packt Publishing has offered me the chance to review two of their new books (thanks to them). I should receive free review copies at the beginning of next week. Since they cover two of my favorite subjects (Maven and Joomla!®), I think they may also interest you. A review will follow in a few days.

    ApacheMaven2EffectiveImplementation  

    Build and Manage Applications with Maven, Continuum, and Archiva

    • Install Apache Maven and follow the sample application to build up your project as quickly as possible
    • Test your applications to ensure maximum stability using Maven's inbuilt tools
    • Use Maven's report and checking tools to ensure the health of your projects
    • Explore Apache Continuum which will help you to ensure the health of your source code
    • Improve your team builds with the powerful combination of Maven, Archiva and Continuum
    • Install and run the repository manager Apache Archiva

     

    Joomla!1.5DevelopmentCookbook

    Joomla!® 1.5 Development Cookbook 

    • Make your extensions extensible, add extension points to allow third parties to customize your extension
    • Create international extensions by enabling multilingual capabilities
    • Build more than just HTML pages - create PDF documents, Atom Feeds, and more!
    • Improve the user experience by adding Ajax
    • Create Atom and RSS feeds to keep users up-to-date
    • Utilize the power of Subversion to maintain your source code
    • Execute database queries and handle returned data in order to access and modify your data
    • Dynamically extend your database tables using JParameter to make your extensions more flexible
    • Keep your gremlins at bay by handling errors the Joomla! way
    • Work with the file system, interrogate existing files and folders and store data in the file system
    • Take control of your workflows by using www.JoomlaCode.org to manage your Joomla! projects
  • The Apache Software Foundation has put a new project in its incubator: Harmony

    A free, open source Java virtual machine and class libraries!!!

    Right now it is only a thread and a FAQ... an interesting discussion has started on Slashdot.org

    The purpose of this project is to create and use an open source, compatible implementation of J2SE 5, the latest version of the Java 2 Standard Edition specification.

    "The Apache Software Foundation provides support for the Apache community of open-source software projects. The Apache projects are characterized by a collaborative, consensus based development process, an open and pragmatic software license, and a desire to create high quality software that leads the way in its field. We consider ourselves not simply a group of projects sharing a server, but rather a community of developers and users."

    Update: Sun executives have endorsed the project, and the company might even participate (found at www.informationweek.com).

  • I will try to keep a history of all my previous machines on this page... yes, it is just nerd/geek behavior...

    geek-inside

    In 15 years...

    • Costs in euros are shown in green; they are similar in amount, you just get more size, disk and speed for your buck
    • 1208 times more IPS, and my PC still sometimes hangs (in order: Vista, XP, Linux, I am looking at you) :-(
    • 81 times more disk space!
    • Power consumption is 360 watts today :-(
    • PCs are still slower than what I expect...

     

    Instructions per second (IPS) is a measure of a computer's processor speed. Many reported IPS values have represented "peak" execution rates on artificial instruction sequences with few branches, whereas realistic workloads consist of a mix of instructions and even applications, some of which take longer to execute than others. The performance of the memory hierarchy also greatly affects processor performance, an issue barely considered in MIPS calculations. Because of these problems, researchers created standardized tests such as SPECint to (maybe) measure the real effective performance in commonly used applications, and raw IPS has fallen into disuse. [WikiPedia]

  • Impact of Zend Optimizer on PHP Performance

    The Zend Optimizer FAQ answers the question "Why use the Zend Optimizer?" with this statement: "The standard Zend run-time compiler used by PHP is indeed extremely fast, generating code that is usually 2 to 10 times faster. But an application that uses the Zend Optimizer typically executes another 40% to 100% faster."

    Read the results of the load test HERE.

  • iText is a library that allows you to generate PDF files on the fly. The iText classes are very useful for people who need to generate read-only, platform independent documents containing text, lists, tables and images. The library is especially useful in combination with Java(TM) technology-based Servlets: the look and feel of HTML is browser dependent; with iText and PDF you can control exactly how your servlet's output will look.
    iText requires JDK 1.2. It is available for free under a multiple license: MPL and LGPL. It also has a complete (but outdated) list of features, and lets you perform operations on already created PDFs (concatenate, split, add pages, add an overlay and so on...)

  • apache_maven

    The Maven Dependency Plugin, among other things, includes a dependency:analyze-duplicate goal.

    The dependency plugin provides the capability to manipulate artifacts. It can copy and/or unpack artifacts from local or remote repositories to a specified location.

    This Apache Maven plugin is really feature rich and provides a lot of interesting goals (for example dependency:tree, dependency:analyze, dependency:copy-dependencies and dependency:analyze-duplicate).
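
    These goals can be run directly from the command line (e.g. mvn dependency:analyze-duplicate) or bound to the build. As a sketch only (the verify phase binding and the execution id are arbitrary choices of mine), the analyze-duplicate goal could be wired into a pom.xml like this:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <executions>
        <execution>
          <id>check-duplicate-declarations</id>
          <phase>verify</phase>
          <goals>
            <!-- reports dependencies declared twice in <dependencies> / <dependencyManagement> -->
            <goal>analyze-duplicate</goal>
          </goals>
        </execution>
      </executions>
    </plugin>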


  • For a long time using multimedia functionality in an application posed a big challenge to most developers. It is high time for the great media frameworks that have emerged to get an easy to use, but still powerful, interface.

    KDE has been using aRts as its media framework and multimedia API since KDE 2.0. aRts has been a great framework in many ways, but didn't manage to keep up and is unmaintained by now. In the meantime many good alternatives have come up, including, but not limited to, libxine, gstreamer, NMM and Helix.

    With Phonon, KDE developers will be able to write applications with multimedia functionality in a fraction of the time needed with one of the above mentioned media frameworks/libs. This will facilitate the usage of media capabilities in the KDE desktop and applications.

  • Here is a how-to, since it took me a very long time to install something which should have been trivial...
    For the benefit of the community, I am publishing it here in my free time :-)... Enjoy...

    Apache Axis and Apache Axis C++ are implementations of SOAP ("Simple Object Access Protocol"), a submission to the W3C. From the W3C draft specification:

    SOAP is a lightweight protocol for exchanging structured information in a decentralized, distributed environment. It is an XML based protocol that consists of three parts: an envelope that defines a framework for describing what is in a message and how to process it, a set of encoding rules for expressing instances of application-defined datatypes, and a convention for representing remote procedure calls and responses.

    Axis C/C++ (Axis CPP) is a non-Java implementation of Axis. At its core Axis CPP has a C++ runtime engine. The provided tooling allows you to create C++ client-side stubs and server-side skeletons. The server skeletons can be deployed to either a full Apache web server using the supplied apache module or a "simple_axis_server" - which is a simple HTTP listener (designed to help you test your services).


    1. Download either Apache 2.0 or Apache 1.3
    unpack it to c:\apache

    2. Download the latest version of Axis C++
    unpack it to c:\axis

    Note: try to avoid spaces in paths; they have always proved to be a mess under Windows.

    3. Open the Apache configuration file and add the following lines at the end

    Apache 1.3: in c:\apache\conf\httpd.conf

    LoadModule axis_module ../axis/lib/modules/mod_axis.dll
    <Location /axis>
    SetHandler axis
    </Location>

    Apache 2.0: in c:\apache\conf\httpd.conf

    LoadModule axis_module ../axis/lib/mod_axis2.dll
    <Location /axis>
    SetHandler axis
    </Location>

    4. Now it starts to get interesting: both mod_axis.dll and mod_axis2.dll depend on an old DLL that no longer exists since Windows 98: msjava.dll

    You need to get msjava.dll from the Internet and copy it into c:\axis\lib
    You also need to copy the Xerces parser xerces-c_2_2_0.dll into c:\axis\lib

    Axis C++ required libraries in the lib directory

    5. Create a file axiscpp.conf (or search the zip distribution for a template of it) and copy it into c:\axis
    Modify all paths to the DLLs accordingly. Note that both relative and absolute paths work.

    Example of axiscpp.conf
    # The comment character is '#'
    # Available directives are as follows
    # (Some of these directives may not be implemented yet)
    #
    # WSDDFilePath:       The path to the server wsdd
    # LogPath:            The path to the axis log
    # ClientLogPath:      The path to the axis client log
    # ClientWSDDFilePath: The path to the client wsdd
    # Transport_http:     The HTTP transport library
    # Transport_smtp:     The SMTP transport library
    # XMLParser:          The xml parser library
    # NodeName:           Node name
    # ListenPort:         Listening port
    # Channel_HTTP:       The HTTP transport channel library
    # Channel_HTTP_SSL:   The HTTP transport secure channel library

    LogPath:c:\axis\logs\AxisLog.txt
    WSDDFilePath:c:\axis\conf\server.wsdd
    XMLParser:c:\axis\lib\AxisXMLParser.dll
    Transport_http:c:\axis\lib\HTTPTransport.dll
    Channel_HTTP:c:\axis\lib\HTTPChannel.dll
    Channel_HTTP_SSL:c:\axis\lib\HTTPSSLChannel.dll

    6. Go into c:\apache and create a small batch file there

    Example of a start script for Apache: startApache.bat
    SET AXIS_HOME=c:\axis
    SET AXISCPP_DEPLOY=c:\axis
    SET PATH=%PATH%;c:\axis\lib
    SET LIB_PATH=%LIB_PATH%;c:\axis\lib

    apache.exe
    pause

    7. Create a file c:\axis\conf\server.wsdd
    Example of c:\axis\conf\server.wsdd
    <?xml version="1.0" encoding="UTF-8"?>
    <!-- The Entity, wspath in the following internal subset allows setting a
         path for the webservices location -->
    <!DOCTYPE vars [ <!ENTITY wspath "/home/sanjaya/Axis/webservices/"> ]>

    <deployment xmlns="http://xml.apache.org/axis/wsdd/"
                xmlns:C="http://xml.apache.org/axis/wsdd/providers/C"
                xmlns:CPP="http://xml.apache.org/axis/wsdd/providers/CPP">
        <globalConfiguration>
        </globalConfiguration>
        <service name="transportProperties"
                 provider="CPP:DOCUMENT"
                 description="This is a simple test">
            <parameter name="className"
                       value="c:\axis\dll\calculator.dll"/>
            <parameter name="allowedMethods" value="add subtract"/>
        </service>
    </deployment>

    Use the script from step 6 to start Apache. If you get an exception, read it or verify DLL dependencies with Dependency Walker


    Open a browser and go to http://localhost/axis

    Axis-cpp-Working-in-apache
  • I put some effort into this new framework over the last few days.

    Done:

    • I documented some parts of it at http://wiki.waltercedric.com/index.php?title=ContinuousBuildforJoomla
    • TeamCity is installed/configured/documented (windows only)
    • XAMPP is installed/configured/documented (windows only)
    • At the same time, I also configured a Puppy Linux VMware image that will allow anybody to have a running environment in no time.
    • I am able to unpack all Joomla! versions that are committed in any repository (CVS, SVN, ClearCase)
    • They can be unpacked anywhere on the file system (configured in setEnv.xml); ideally they are unpacked in the htdocs root of XAMPP
    • Code is committed at Joomla forge in SVN http://joomlacode.org/gf/project/continbuild4j/

    Issues

    Selenium test suites do not accept a baseurl (or only a part of it), so I have a full path like /Joomla1.5.7/installation/index.php in a Selenium test case instead of /installation/index.php in the test case with baseurl=http://localhost/Joomla1.5.7

    Architecture

    3rd Party

    • I use antelope for some advanced ANT operations: substring, indexof, loop
    • I use selenium java server on port 4444 (default)

    Cluster

    All cluster operations are in cluster.xml; the basic functions are:

    • cluster.init
      • cluster.remove: remove all instances of Joomla! in the checkout directory
      • joomla.unpack.all: unpack all Joomla! versions that are checked in another SVN root
      • joomla.install.all: run the same selenium test case joomla.install.html on all Joomla! instances
      • joomla.remove.all.installation: remove all Joomla! installation directories
      • joomla.check.all: check all Joomla! installations for correctness
    • cluster.start
    • cluster.remove
    • cluster.stop

    Joomla!

    All Joomla specific operations are in joomla.library.xml

    • Unpack a Joomla! version
    • Remove the installation directory from a version
    • Apply a selenium test suite install.joomla.html that can use the regular Joomla! installer
    • It is also able to do all of the above on all Joomla! versions found (via a regular expression) in the checkout directory; a sketch of such an operation appears below
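
    A purely hypothetical sketch (this is not the actual content of joomla.library.xml; the property names and file layout are assumptions) of what unpacking one Joomla! version as an ANT macro could look like:

    <!-- hypothetical macro: unpack one Joomla! zip from the checkout directory into XAMPP htdocs -->
    <macrodef name="joomla.unpack">
      <attribute name="version"/>
      <sequential>
        <unzip src="${checkout.dir}/Joomla_@{version}.zip"
               dest="${htdocs.dir}/Joomla@{version}"/>
      </sequential>
    </macrodef>

    <!-- usage -->
    <joomla.unpack version="1.5.7"/>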

    Selenium

    • All selenium operations are in selenium.xml
    • All test suites and test cases are in /selenium/joomla/ (see the sketch below for how such a suite could be launched)
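
    As an illustration only (not the project's actual selenium.xml; the jar location, browser string and file names are assumptions), an HTML test suite can be launched from ANT through the Selenium server's -htmlSuite mode:

    <java jar="lib/selenium-server.jar" fork="true" failonerror="true">
      <arg value="-htmlSuite"/>
      <!-- browser, base URL, suite file, result file -->
      <arg value="*firefox"/>
      <arg value="http://localhost/"/>
      <arg value="selenium/joomla/install.joomla.html"/>
      <arg value="selenium-results.html"/>
    </java>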

    PHPUnit

    All PHPUnit operations are in phpunit.xml

    Settings

    Settings are in setEnv.xml; in the future I will lazily load an override file if it is given as an environment variable.
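
    A minimal ANT sketch of that idea (the environment variable name CB4J_SETENV is made up): <property> silently ignores a file attribute pointing to a non-existing file, so the override is effectively optional, and because ANT properties are write-once, values loaded here win over defaults defined later in setEnv.xml.

    <property environment="env"/>
    <!-- optional override: skipped silently when CB4J_SETENV is not set or the file is missing -->
    <property file="${env.CB4J_SETENV}"/>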


    If you know ANT, the code is quite readable...