Stuff that's hard to look up
Candidates for utility classes
final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
final AtomicReference<Result> value = new AtomicReference<Result>();
final CountDownLatch latch = new CountDownLatch(1);
scheduler.submit(new Runnable() {
    /* Retrieve a result by polling every 30 seconds */
    @Override
    public void run() {
        try {
            final Result pollResult = // poll resource for result;
            if (pollResult.isReady()) {
                value.set(pollResult);
                latch.countDown();
            } else {
                scheduler.schedule(this, 30L, TimeUnit.SECONDS);
            }
        } catch (Throwable t) {
            log.error("Cancelling polling for result, because an unexpected Throwable occurred:", t);
            latch.countDown();
        }
    }
});
// Blocks until a result is available
while (true) {
    try {
        latch.await();
        break;
    } catch (InterruptedException e) {
        log.error("InterruptedException occurred:", e);
    }
}
return value.get();
In this contrived example, when a Result is ready, its isReady method returns true. You could just as easily check for null, or use whatever readiness check suits your case.
You could also use a Callable, which could throw the exception that is currently being logged.
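Here is a minimal sketch of that Callable variant, for comparison. It is not a drop-in replacement for the snippet above: the polling loop moves inside call(), pollResource() is a made-up stand-in for however you actually poll the resource, and anything thrown inside call() surfaces from Future::get wrapped in an ExecutionException (handling of get()'s own checked exceptions is omitted here).
final ExecutorService executor = Executors.newSingleThreadExecutor();
final Future<Result> future = executor.submit(new Callable<Result>() {
    @Override
    public Result call() throws Exception {
        while (true) {
            final Result pollResult = pollResource(); // hypothetical polling call
            if (pollResult.isReady()) {
                return pollResult;
            }
            TimeUnit.SECONDS.sleep(30L); // wait before the next poll
        }
    }
});
return future.get(); // blocks; rethrows any failure as an ExecutionException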
Used undeclared dependencies can be found with mvn dependency:analyze
mvn dependency:analyze
mvn dependency:tree -Dincludes=groupId:artifactId:type:version
How can I find the code that is using an undeclared dependency?
Exclude the dependency and see what fails.
What’s the issue?
Potentially, your source code is using a library that your Project Object Model (pom) does not declare as a dependency.
How can I fix the issue?
Declare your dependency explicitly.
How can I avoid this issue?
Only use classes that resolve as a first-level dependency. The easiest way to do this is to be aware of the classes that should be available to your source code. Avoid making your class choices by seeing what’s available on your IDE’s autocomplete menu.
There is also “Transitive Dependency Management”, but that’s a bit heavy-handed. See more details in Gradle’s nice overview.
Consider inserting one of the following snippets in testing code to simulate breakpoints:
A pause that can only be ended by killing the test run (e.g. with Ctrl-C):
synchronized(this) { wait(); }
A pause that can be resumed by sending input on stdin (e.g. pressing Enter):
try { System.in.read(); } catch (Throwable t) { System.out.println("Error while paused: " + t); }
This is useful if you need to pause test execution halfway through, and have made up a good reason to avoid setting up breakpoints within a debugger.
This approach is not as powerful as a decent debugger’s breakpoint would be, but it’s still useful, particularly when placed within a test that is executed separately from your application. You end up pausing the test abruptly, so a single-threaded test framework (JUnit’s default) won’t get a chance to run your clean-up methods until you resume.
Usually a test will do the following: set up fixtures (test data), exercise the system under test, and clean up the fixtures. Inserting this snippet just before the test framework tears down your fixtures lets you stop the test and explore your application while the fixtures are still in place, saving you the manual work of setting up the test data and exercising the system by hand:
synchronized(this) { wait(); }
This should be a decent start but you could add more functionality. You could even embed a REPL into your test to analyze your application state via a CLI. As a simple example, the ability to resume test execution can be added so that the test framework will attempt to clean up any test fixtures it set up.
A resumable breakpoint can be implemented with something along the lines of:
try { System.in.read(); } catch (Throwable t) { System.out.println("Error while paused: " + t); }
If you do this and you run your tests with Maven’s Surefire Plugin (that’s the default for the mvn test command), you might run into a problem where your test does not listen to your input. As far as I understand, this is because the Maven Surefire Plugin forks your test to a separate process that is not aware of input coming in from your terminal’s stdin. The workaround is pretty simple once you know what to look for. Run your tests with the -DforkCount=0 flag:
mvn -DforkCount=0 test
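Putting the resumable pause to work, a minimal sketch might look like the following (JUnit 4; the class, test, and helper names are made up for illustration):
import org.junit.Test;

public class ManualInspectionTest {

    /** Blocks until something arrives on stdin (e.g. you press Enter), then returns. */
    private static void pauseForManualInspection() {
        System.out.println(">>> Test paused. Press Enter to resume and tear down the fixtures.");
        try {
            System.in.read();
        } catch (Throwable t) {
            System.out.println("Error while paused: " + t);
        }
    }

    @Test
    public void exploreApplicationWithFixturesInPlace() {
        // ... set up fixtures and exercise the system under test here ...
        pauseForManualInspection();
        // once this returns, the test framework gets to run its usual clean-up methods
    }
}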
Use an annotation and an accompanying Annotation Processor to generate Java classes from a Mustache template at compile time. Useful for generating boilerplate classes.
This topic is a little larger than a snippet allows, due to the amount of wiring required. Please refer to the java_snippet_2014-08-02 repository for extended details and example code.
Reference: http://deors.wordpress.com/2011/10/08/annotation-processors/
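For a rough idea of the shape of such a processor, here is a minimal sketch. The annotation name com.example.GenerateBoilerplate is made up, the Mustache rendering is elided, and the processor still needs to be registered under META-INF/services/javax.annotation.processing.Processor; see the repository above for the real wiring.
import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;
import javax.tools.JavaFileObject;

@SupportedAnnotationTypes("com.example.GenerateBoilerplate") // hypothetical annotation
@SupportedSourceVersion(SourceVersion.RELEASE_7)
public class BoilerplateProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element annotated : roundEnv.getElementsAnnotatedWith(annotation)) {
                try {
                    // Render the Mustache template here (elided) and write the result out as a new source file
                    JavaFileObject file = processingEnv.getFiler()
                            .createSourceFile(annotated.getSimpleName() + "Generated");
                    Writer writer = file.openWriter();
                    try {
                        writer.write("// generated from a Mustache template");
                    } finally {
                        writer.close();
                    }
                } catch (IOException e) {
                    processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, e.toString(), annotated);
                }
            }
        }
        return true;
    }
}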
MessageDigest and BigInteger
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

private String asMD5Checksum(final String original) {
    // StandardCharsets.UTF_8 avoids the checked UnsupportedEncodingException thrown by getBytes("UTF8")
    final byte[] bytes = original.getBytes(StandardCharsets.UTF_8);
    try {
        return new BigInteger(1, MessageDigest.getInstance("MD5").digest(bytes))
                .toString(16);
    } catch (NoSuchAlgorithmException e) {
        throw new RuntimeException(e);
    }
}
These digests aren’t perfect, since BigInteger::toString(16) drops leading zeros and the result isn’t zero-padded to 32 characters, but they should do for simple use cases.
Consider leveraging ThreadLocal<MessageDigest>::get() to avoid instantiating a MessageDigest on every call:
private static final ThreadLocal<MessageDigest> messageDigest = new ThreadLocal<MessageDigest>() {
    @Override
    protected MessageDigest initialValue() {
        try {
            return MessageDigest.getInstance("MD5");
        } catch (NoSuchAlgorithmException e) {
            throw new RuntimeException(e);
        }
    }
};
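With that in place, the checksum method might look something like this sketch (reusing the imports from the earlier snippet). Sharing one instance per thread is safe here because MessageDigest::digest resets the digest once it completes:
private String asMD5Checksum(final String original) {
    final byte[] bytes = original.getBytes(StandardCharsets.UTF_8);
    // digest() resets the MessageDigest afterwards, so the thread-local instance can be reused
    return new BigInteger(1, messageDigest.get().digest(bytes)).toString(16);
}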
Package-level annotations apply to all classes in a package and live in package-info.java under that package’s directory. Its specification is at JLS 7.4.1.
You can see an example of this in JAXB, which takes advantage of package-level annotations to declare the default namespace to use in a package. Actually, it seems to be mandatory to declare a default namespace in package-info.java if namespaces are used as “markers” but are not backed by an XML schema. (Take that last part with a grain of salt.)
com/foo/bar/package-info.java:
@javax.xml.bind.annotation.XmlSchema(namespace = "http://bar.foo.com/schema/1.0",
        elementFormDefault = javax.xml.bind.annotation.XmlNsForm.QUALIFIED)
package com.foo.bar;
You can then override the default namespace on sub elements within that package by using the proper annotation:
package com.foo.bar;

import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(namespace = "")
public class Something {
    @XmlRootElement
    public static class Subthing { }
}
Both Something and Something.Subthing will have an empty namespace, despite the package-level namespace being http://bar.foo.com/schema/1.0.
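To see which namespaces the marshalled XML actually ends up with, a quick sketch along these lines can help (NamespaceCheck is a made-up class name; it assumes the classes above are on the classpath):
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;

import com.foo.bar.Something;

public class NamespaceCheck {
    public static void main(String[] args) throws Exception {
        // Marshal an instance to stdout and inspect the namespace declarations by eye
        JAXBContext context = JAXBContext.newInstance(Something.class);
        Marshaller marshaller = context.createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
        marshaller.marshal(new Something(), System.out);
    }
}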
BlockingQueue
final BlockingQueue<Foo> outbound;
final Collection<Foo> outboundElements;
while (true) {
    // Block until at least one element arrives (take() can throw InterruptedException)...
    outboundElements.add(outbound.take());
    // ...then grab, without blocking, whatever else has queued up in the meantime
    outbound.drainTo(outboundElements);
    // Work with the contents of outboundElements for a while...
    outboundElements.clear();
}
BlockingQueue::drainTo is not blocking, and BlockingQueue::take only takes a single element. Putting the two together could be useful for applications that have periods of high activity, low activity, and no activity at arbitrary times. This should help keep the outbound queue from filling up too much during busy periods, while preventing your application from going through the work step repeatedly with zero elements.
Note that this isn’t a very efficient algorithm when elements first start coming in, so if the work you need to do is so expensive that doing it twice when you could do it once is prohibitive, consider exploring a more sophisticated approach.
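For completeness, here is a sketch of the same loop wrapped in a Runnable that exits cleanly when its thread is interrupted (Foo and the outbound queue are whatever your application already has):
final Runnable drainLoop = new Runnable() {
    @Override
    public void run() {
        final Collection<Foo> outboundElements = new ArrayList<Foo>();
        try {
            while (!Thread.currentThread().isInterrupted()) {
                outboundElements.add(outbound.take()); // blocks until something arrives
                outbound.drainTo(outboundElements);    // non-blocking catch-up
                // Work with the contents of outboundElements for a while...
                outboundElements.clear();
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // preserve the interrupt status and exit
        }
    }
};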
jstat
$ /usr/java/jdk1.7.0_10/bin/jstat -gc -t -h 20 <PID> 5s | tee ~/jstat.out
This samples garbage collection statistics for process <PID> every 5 seconds, prepends a timestamp column (-t), repeats the column headers every 20 samples (-h 20), and saves a copy of the output to ~/jstat.out. The meaning of the column headers is detailed in http://docs.oracle.com/javase/7/docs/technotes/tools/share/jstat.html#gc_option.
Also interesting to note: by default, Java 7 seems to shrink the Eden space over time, at least when it can do so. This causes an increase in the rate of incremental garbage collections; however, it is nothing to be alarmed about as long as the full garbage collection rate does not increase.
ExecutorService
// From ExecutorService's JavaDoc
private static void shutdownAndAwaitTermination(final ExecutorService pool) {
    pool.shutdown(); // Disable new tasks from being submitted
    try {
        // Wait a while for existing tasks to terminate
        if (!pool.awaitTermination(5, TimeUnit.SECONDS)) {
            pool.shutdownNow(); // Cancel currently executing tasks
            // Wait a while for tasks to respond to being cancelled
            if (!pool.awaitTermination(5, TimeUnit.SECONDS))
                System.err.println("pool did not terminate");
        }
    } catch (InterruptedException ie) {
        // (Re-)Cancel if current thread also interrupted
        pool.shutdownNow();
        // Preserve interrupt status
        Thread.currentThread().interrupt();
    }
}
Consider using LOG::error instead of System.err::println, where LOG is:
private static final org.slf4j.Logger LOG = org.slf4j.LoggerFactory.getLogger(CurrentClass.class);
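As a usage sketch (the pool and its size are placeholders), you could register this from a JVM shutdown hook so the pool is drained on exit:
final ExecutorService pool = Executors.newFixedThreadPool(4);
Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
    @Override
    public void run() {
        shutdownAndAwaitTermination(pool);
    }
}));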
Thread::join
This is a way of keeping the main thread (or any other one really) alive. Not always the right approach though.
final ExecutorService stayAlive = Executors.newSingleThreadExecutor();
stayAlive.submit(new Runnable() {
    @Override
    public void run() {
        try {
            Thread.currentThread().join();
        } catch (InterruptedException e) {
        } finally {
            LOG.debug("resubmitting stayAlive runnable");
            stayAlive.submit(this);
        }
    }
});
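One alternative, not from the original snippet but a common pattern, is to park the main thread on a CountDownLatch and release it from a shutdown hook (await() throws InterruptedException, so declare or handle it):
final CountDownLatch keepAlive = new CountDownLatch(1);
Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
    @Override
    public void run() {
        keepAlive.countDown(); // let the main thread finish when the JVM is asked to shut down
    }
}));
keepAlive.await(); // the main thread blocks here until the latch is released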
exec-maven-plugin is a quick way to kick off Java scripts in a project that is already using Maven.
mvn clean compile exec:java -Dexec.mainClass="canonical.name.of.class.Main"
Usually this class is going to live in the src/test directory instead of src/main. In that case, exec:java will need -Dexec.classpathScope=test as well, and the test classes have to be compiled first (e.g. with test-compile):
mvn clean test-compile exec:java -Dexec.mainClass="canonical.name.of.class.Main" -Dexec.classpathScope=test
This command is a heuristic, so it doesn’t really care about the maven-release-plugin. Basically, it’s supposed to check out the latest master branch, delete the local and remote tags, and undo 2 commits. It assumes that no one has pushed any commits since you made a release with maven-release-plugin.
function git-undo-mvn-release {
  TAG_NAME=$1 &&
  echo " >>> undoing last two commits and removing tag named '${TAG_NAME}' <<<" &&
  echo " getting and checking out the latest copy of the master branch" &&
  git fetch && git checkout master && git pull origin master && git status &&
  echo " removing local and remote tag ${TAG_NAME} if both exist" &&
  git tag -d ${TAG_NAME} && git push origin :${TAG_NAME} &&
  echo " git reset HEAD^^ --hard" &&
  git reset HEAD^^ --hard && git status && git log --oneline | head &&
  echo ' >>> If you are satisfied, run `git push origin master --force` <<<';
}
Here is a sample of how it should work:
$ git log --oneline | head -n 1
b747131 a real commit... the one you want to get back to.
$ mvn release:prepare
# ...
$ git log --oneline | head -n 3
db09eda [maven-release-plugin] prepare for next development iteration
00024fe [maven-release-plugin] prepare release test/1.0
b747131 a real commit... the one you want to get back to.
$ git-undo-mvn-release test/1.0
>>> undoing last two commits and removing tag named 'test/1.0' <<<
getting and checking out the latest copy of the master branch
Already on 'master'
From https://github.com/yegeniy/yegeniy.github.io
* branch master -> FETCH_HEAD
Already up-to-date.
On branch master
nothing to commit, working directory clean
git reset HEAD^^ --hard
HEAD is now at b747131 a real commit... the one you want to get back to.
removing local and remote tag test/1.0 if both exist
Deleted tag 'test/1.0' (was db09eda)
To https://github.com/yegeniy/yegeniy.github.io.git
- [deleted] test/1.0
On branch master
nothing to commit, working directory clean
>>> If you are satisfied, run `git push origin master --force` <<<
$ git log --oneline | head -n 1
b747131 a real commit... the one you want to get back to.
$ git push origin master # --force
Easy to overlook, but you should include ellipses (...) after a package name, to also enable assertions on it and all its subpackages.
$ java -? 2>&1 |grep -A2 -e '-ea'
-ea[:<packagename>...|:<classname>]
-enableassertions[:<packagename>...|:<classname>]
enable assertions with specified granularity
So, use -ea:com.foo... to enable assertions on com.foo and all its subpackages.
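For example, given a class like the following sketch (the package and class names are made up), running it with java -ea:com.foo... com.foo.demo.AssertDemo trips the assert, while running it without -ea does not:
package com.foo.demo;

public class AssertDemo {
    public static void main(String[] args) {
        final int size = -1;
        // Throws AssertionError only when assertions are enabled for this package
        assert size >= 0 : "size must be non-negative, but was " + size;
        System.out.println("size = " + size);
    }
}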
mvn dependency:analyze | less
It will show you unused declared dependencies and used undeclared dependencies, so you can add or remove dependencies as appropriate. Just don’t go crazy with it.
Update: (mis?)Using this introduced some problems into my code because I removed dependencies whose transitive dependencies were being used.
2014/10/10 Update: The thing to pay attention to, really, is the used undeclared dependencies. They list the landmines where you are using a transitive dependency directly.
Start Tomcat with catalina jpda start. It will start Tomcat so that a remote debugger can be connected to port 8000.
Set up your debugger to attach to a JVM started with the following options:
-Xdebug -Xrunjdwp:transport=dt_socket,address=8000,server=y,suspend=n
In IDEA, this is done in the Run/Debug Configurations by setting up a “Remote” configuration.
http://wiki.apache.org/tomcat/FAQ/Developing#Q1
http://stackoverflow.com/questions/11480563/debugging-with-tomcat-and-intellij-community-edition
If you have internal dependencies (e.g. multi-module project), you should run mvn install on that dependency before running mvn package on the multi-module project. At least the first time, to get the dependencies installed into your local repository (under ~/.m2/repository/).