Eclipse Apache Hadoop Plugin Build 1.0.2
So, there is no good tutorial on how to build the current Apache Hadoop Eclipse plugin, so here it is, in all its minimal glory:
Get Ant:
sudo emerge ant
Get Eclipse and put it somewhere. We'll refer to that location as $ECLIPSE_HOME from now on (for me, ~/Desktop/eclipse/), you get the idea.
Download the Apache Hadoop source, same version as this post, 1.0.2. If you are lazy, here is a link; verify at will:
http://apache.osuosl.org/hadoop/common/hadoop-1.0.2/hadoop-1.0.2.tar.gz
Extract Hadoop into a directory, which we'll refer to as $HADOOP_HOME (i.e. ~/Desktop/eclipse/libs/hadoop-1.0.2/ for me).
cd $HADOOP_HOME
Then edit: vim src/contrib/eclipse-plugin/build.properties
add this line:
eclipse.home = $ECLIPSE_HOME
(build.properties is a plain Java properties file, not a shell script, so $ECLIPSE_HOME will not be expanded; write your real Eclipse directory here)
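Since the properties file won't expand shell variables for you, you can let the shell do the expansion and append the literal result. A minimal sketch, assuming my example Eclipse path; substitute your own:

```shell
# Append eclipse.home to the plugin's build properties. The shell
# expands $HOME here, so the file ends up with a literal path.
PROPS=src/contrib/eclipse-plugin/build.properties
mkdir -p "$(dirname "$PROPS")"   # no-op inside a real hadoop source tree
echo "eclipse.home = $HOME/Desktop/eclipse" >> "$PROPS"
tail -n 1 "$PROPS"               # show what actually got written
```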
Now, yes you will get build errors, chill.
First:
Get Sun’s JDK SE 5 (needed to build Forrest):
http://www.oracle.com/technetwork/java/archive-139210.html
(annoying reg I know)
cd ~/Downloads/
chmod +x jdk-1_5.......(hit tab)
sudo ./jdk-1_5.......(hit tab)
(agree to license, spacebar to skip pages)
mv jdk1.5.0_....(hit tab) /opt/sun-jdk-1.5
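You can confirm the JDK landed where the build flag later expects it. A quick sketch; /opt/sun-jdk-1.5 is the path used in this post:

```shell
# Verify the relocated Sun JDK 5; the ant build later in this post
# is pointed at this exact directory via -Djava5.home.
JDK5=/opt/sun-jdk-1.5
if [ -x "$JDK5/bin/java" ]; then
  "$JDK5/bin/java" -version 2>&1 | head -n 1
else
  echo "JDK 5 not found at $JDK5 - check the mv step above"
fi
```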
Download Apache forrest 0.8 (0.9 will crash… lol)
http://archive.apache.org/dist/forrest/0.8/
mkdir ~/apache
cd ~/apache
tar -xvzf ~/Downloads/apache-forrest-0.8...(hit tab)
mv apache-forrest...(hit tab) forrest-0.8
Now let's build this baby. Go to your hadoop directory:
ant clean package -Djava5.home=/opt/sun-jdk-1.5 -Dforrest.home=$HOME/apache/forrest-0.8
(your shell should expand $HOME; if it doesn't, spell out /home/your_user_name)
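Before kicking off a long ant run, it's worth sanity-checking both paths those flags point at. A sketch using the directories from this post; adjust to your own layout:

```shell
# Fail fast if either build dependency directory is missing.
JAVA5_HOME=/opt/sun-jdk-1.5
FORREST_HOME="$HOME/apache/forrest-0.8"
for d in "$JAVA5_HOME" "$FORREST_HOME"; do
  if [ -d "$d" ]; then
    echo "ok: $d"
  else
    echo "missing: $d - fix this before running ant"
  fi
done
```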
should build, if not, let me know
Now go into the built plugin directory and copy the jar:
cd build/contrib/eclipse-plugin/
cp hadoop-eclipse-plugin-1.0.3-SNAPSHOT.jar $ECLIPSE_HOME/plugins/
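Note the version suffix of the built jar can differ from the source tree's version (1.0.3-SNAPSHOT here, built from the 1.0.2 tree), so globbing for it is safer than hard-coding the name. A sketch, assuming $ECLIPSE_HOME is set in your shell:

```shell
# Copy whatever plugin jar the build produced into Eclipse's plugins
# directory, without hard-coding the exact version suffix.
JAR=$(ls build/contrib/eclipse-plugin/hadoop-eclipse-plugin-*.jar 2>/dev/null | head -n 1)
if [ -n "$JAR" ]; then
  cp "$JAR" "$ECLIPSE_HOME/plugins/"
  echo "installed: $JAR"
else
  echo "no plugin jar found - did the ant build succeed?"
fi
```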
Close & restart eclipse, and BOOM!
Choose Open Perspective -> Other -> Map/Reduce
File->New->Project->MapReduce Project
Select Configure Hadoop Install Directory
choose $HADOOP_HOME
make a project name, and boom
see my other tutorial on getting started with hadoop
Common Errors and Solutions:
1. Make sure you have ant installed; it is a “make”-like build system for Java :-)
sudo emerge -av ant
2. If you get an error like “error reading log4j/log4j/jars/log4j-1.2.15.jar; error in opening zip” most likely it is one of two things:
A. Make sure you have log4j installed:
sudo emerge -av log4j
B. If you have it installed on your system and you are still getting that build error, try editing ./.classpath in the hadoop home directory and change the location of log4j-1.2....... ; the jar should also be in your forrest-0.8/lib/core/ folder.
3. If you get stuck, don’t hesitate to ask in the comments!
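For the “error in opening zip” case in item 2, the cached jar under ~/.ivy2 is often just a truncated download; checking it and clearing it forces ivy to re-fetch on the next build. A sketch, with the path taken from the error message above:

```shell
# Test the ivy-cached log4j jar; if it is not a valid zip archive,
# delete it so the next ant run downloads a fresh copy.
JAR="$HOME/.ivy2/cache/log4j/log4j/jars/log4j-1.2.15.jar"
if [ ! -f "$JAR" ]; then
  echo "no cached jar - ivy will fetch it on the next build"
elif unzip -t "$JAR" >/dev/null 2>&1; then
  echo "cached jar looks intact"
else
  echo "cached jar is corrupt - removing it so ivy re-fetches"
  rm -f "$JAR"
fi
```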
I got an error when building: “hadoop-1.0.2/build.xml:614: Execute failed: java.io.IOException: Cannot run program “autoreconf” (in directory “/usr/lib/eclipse/libs/hadoop-1.0.2/src/native”): java.io.IOException: error=2, No such file or directory”
If you can help me I will be really happy.
Fuat said this on April 27, 2012 at 3:43 pm |
Do you have autoreconf installed?
try running it:
autoreconf -h
If it is not installed, this is what equery says about autoreconf on my machine:
tripl3fault@tripl3fault-gentoo ~ $ equery belongs -e autoreconf
* Searching for autoreconf …
sys-devel/autoconf-wrapper-12 (/usr/bin/autoreconf -> ../lib64/misc/ac-wrapper.sh)
which means it is part of a wrapper managing the many versions of autoconf, and it shows that autoreconf is a symbolic link to the script that manages the versions (don’t make the link yourself). Just install the wrapper, and you should be fine:
sudo emerge -av autoconf-wrapper
These are the packages that would be merged, in order:
Calculating dependencies… done!
[ebuild R ] sys-devel/autoconf-wrapper-12 0 kB
If you do have autoreconf installed and autoreconf -h works and you are still getting that error, please dump your settings, i.e. your hadoop directory, your eclipse directory etc.
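If you want to check the whole native-build toolchain in one go (tool list based on the errors reported in these comments: autoreconf, plus automake and libtool), something like this works:

```shell
# Report which of the tools the native build shells out to are on PATH.
missing=0
for tool in autoreconf automake libtool; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool"
    missing=$((missing + 1))
  fi
done
echo "$missing tool(s) missing"
```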
Hope that helps! Cheers!
tripl3fault said this on April 27, 2012 at 10:48 pm |
Thank you for your help. I sorted out autoconf, but now I am having a problem like: error: error reading /root/.ivy2/cache/log4j/log4j/jars/log4j-1.2.15.jar; error in opening zip file
I hope you can help me.
fuat said this on April 28, 2012 at 1:23 pm
Sure, glad to help. I already had it installed on my machine; I will add how to install log4j to the post, since it is a build requirement.
tripl3fault said this on April 28, 2012 at 2:06 pm
I still have this error:
error: datatype library “http://www.w3.org/2001/XMLSchema-datatypes” not recognized
I’m not able to solve it… please help me… I’m spending too much time to compile a stupid plugin!
Thank you very much, first of all, for your guide, one of the best I have found, and second, thank you in advance for your help with my problem.
Matteo.
Matteo said this on June 25, 2012 at 2:49 pm |
Hi,
Are you running Java 6? Make sure your Java home points to Java 5 for the build (the -Djava5.home flag above). Does that fix it?
tripl3fault said this on July 15, 2012 at 4:55 pm |
autoreconf: /usr/bin/autoconf failed with exit status: 1
I got this error when I tried to install 1.0.2 .
I tried installing autoconf too, but still the same error. Thanks
Pavan said this on July 3, 2012 at 6:13 am |
I also had Execute failed: java.io.IOException: Cannot run program “autoreconf” (in directory “/usr/lib/eclipse/libs/hadoop-1.0.2/src/native”): java.io.IOException: error=2, No such file or directory when building hadoop 1.0.3, but in this case I had autoreconf on my path.
In case it helps others:
1. Ultimately this call ends up in the Windows API ::CreateProcess() – this call does not understand/execute shell scripts; in fact it will only execute .EXEs, so I changed the build.xml to invoke bash (which is an .EXE) first.
2. Then got hit by this bug with autoconf:
http://forums.gentoo.org/viewtopic-p-6830420.html
Rather than patch the wrapper script as suggested (which looks like it was incorporated into later autoconf releases), I just added the full path to get me past this and through to the next problem:
3. Had to add automake and libtool packages to cygwin to get this to completely work.
Now off to the next problem…
Alex Raitt said this on July 12, 2012 at 6:37 pm |