Need help getting Accumulo running.

Need help getting Accumulo running.

Park, Jee [USA]

Hi,

I had trouble getting Accumulo to work on a VM instance of Ubuntu (11.04) using this guide: https://gist.github.com/1535657.

Does anyone have a step-by-step guide to get it running on either Ubuntu or Windows 7?

Thanks!


Re: Need help getting Accumulo running.

William Slacum-2
What, specifically, is giving you trouble?


Re: Need help getting Accumulo running.

John Vines
In reply to this post by Park, Jee [USA]
We currently don't really support running on Windows. I'm sure there are
ways to get it running with Cygwin, but our efforts are better spent in
other directions for now.

As for getting it going on Ubuntu, I haven't seen that guide before. Can
you let me know where it broke?

For the record, when I was developing ACCUMULO-404, I was working in Ubuntu
VMs and I used Apache BigTop and our Debian packages to facilitate installation.
They don't do everything for you, but I think if you use 1.4.1 (not sure if
I got the debs into 1.4.0), it should reduce the installation work you
must do to some minor configuration.

John


Re: Need help getting Accumulo running.

Miguel Pereira
Hi Jee,

I used that same guide to install Accumulo, but I used this guide to
install Hadoop:

http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

Furthermore, here are the steps I took to install Accumulo, where I used
version 1.4.0 and the standalone conf.
Please note you also need to install a Java JDK and set your JAVA_HOME; I
used JDK 1.7.

Setting up Accumulo


   - git clone git://github.com/apache/accumulo.git
   - cd accumulo
   - git checkout tags/1.4.0 -b 1.4.0
   - mvn package && mvn assembly:single -N    # this can take a while
   - cp conf/examples/512MB/standalone/* conf
   - vi accumulo-env.sh


test -z "$JAVA_HOME" && export JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
test -z "$HADOOP_HOME" && export HADOOP_HOME=/home/hduser/developer/workspace/hadoop
test -z "$ZOOKEEPER_HOME" && export ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5

   - vi accumulo-site.xml


    modify user, password, secret, memory (see the sample properties below)


   - bin/accumulo init
   - bin/start-all.sh
   - bin/accumulo shell -u root

If you get the shell up, you know you're good.
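
For reference, the kind of properties that step touches in the 1.4 example
accumulo-site.xml looks roughly like this; the names below are from my setup,
so double-check them against the comments in your copy of the file:

<property>
  <name>instance.zookeeper.host</name>
  <value>localhost:2181</value>
</property>

<property>
  <name>instance.secret</name>
  <value>change-this-shared-secret</value>
</property>

<property>
  <name>trace.user</name>
  <value>root</value>
</property>

<property>
  <name>trace.password</name>
  <value>the-root-password-you-give-to-init</value>
</property>

<property>
  <name>tserver.memory.maps.max</name>
  <value>256M</value>
</property>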



Re: Need help getting Accumulo running.

David Medinets
oh... I think you missed a few steps from the gist:

$ cd ~
$ export TAR_DIR=~/workspace/accumulo/src/assemble/target
$ tar xvzf $TAR_DIR/accumulo-1.5.0-incubating-SNAPSHOT-dist.tar.gz

# Add the following to your .bashrc file.
$ export ACCUMULO_HOME=~/accumulo-1.5.0-incubating-SNAPSHOT

$ cd $ACCUMULO_HOME/conf

These are the steps where you unpack the newly created .tar.gz file into
your home directory. It seems like you are running Accumulo from the
source code directory. Also note that I wrote those steps for v1.5.0,
which might differ from v1.4.0.
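
A quick, optional sanity check after unpacking (the version string is the one
from my steps above; substitute whatever you actually built):

$ source ~/.bashrc
$ echo $ACCUMULO_HOME        # should point at ~/accumulo-1.5.0-incubating-SNAPSHOT
$ ls $ACCUMULO_HOME/bin      # should list accumulo, start-all.sh, stop-all.sh, etc.
$ ls $ACCUMULO_HOME/conf     # this is where the example configs get copied to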


RE: [External] Re: Need help getting Accumulo running.

Park, Jee [USA]
In reply to this post by Miguel Pereira
Thanks everyone for the responses!

So, I got Hadoop to run and installed Accumulo following Miguel's email. The
problem now is that when I do

$ bin/accumulo init

it tries to connect a few times and then times out. Here is what it prints
out. Just to let you know, I did not change anything in the accumulo-site.xml
file.
Thanks,
Jee

hduser@ubuntu:~/accumulo$ bin/accumulo init
02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 0 time(s).
02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 1 time(s).
02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 2 time(s).
02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 3 time(s).
02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 4 time(s).
02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 5 time(s).
02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 6 time(s).
02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 7 time(s).
02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 8 time(s).
02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 9 time(s).
02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException: Call to
localhost/127.0.0.1:54310 failed on connection exception:
java.net.ConnectException: Connection refused
java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on
connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
at org.apache.hadoop.ipc.Client.call(Client.java:743)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at $Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.accumulo.start.Main$1.run(Main.java:89)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
at org.apache.hadoop.ipc.Client.call(Client.java:720)
... 20 more
Thread "init" died null
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.accumulo.start.Main$1.run(Main.java:89)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.RuntimeException: java.net.ConnectException: Call to
localhost/127.0.0.1:54310 failed on connection exception:
java.net.ConnectException: Connection refused
at org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
... 6 more
Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:54310
failed on connection exception: java.net.ConnectException: Connection
refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
at org.apache.hadoop.ipc.Client.call(Client.java:743)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at $Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
... 6 more
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
at org.apache.hadoop.ipc.Client.call(Client.java:720)
... 20 more


Re: [External] Re: Need help getting Accumulo running.

Jim Klucar
Did you verify that zookeeper is running?
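
For example, either of these should tell you (assuming ZooKeeper is on the
default client port 2181 and zkServer.sh is where your ZOOKEEPER_HOME points):

$ echo ruok | nc localhost 2181           # a healthy ZooKeeper answers "imok"
$ $ZOOKEEPER_HOME/bin/zkServer.sh status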


Re: [External] Re: Need help getting Accumulo running.

Miguel Pereira
Jee, I think I got that error before. A few things:

1) what Jim said
2) make sure that your environment variables are set up properly; in my
example the HOME directories might not match up with yours
3) make sure your hadoop file hdfs-site.xml in the hadoop/conf dir has the
name and data directories set up; this screwed me up before
<property>
  <name>dfs.name.dir</name>
  <value>/home/blue/dfs/name</value>
  <description>
  </description>
</property>

<property>
  <name>dfs.data.dir</name>
  <value>/home/blue/dfs/data</value>
  <description>
  </description>
</property>

The directories might not match your environment; you can basically use
whatever you want.
Then reformat your namenode and restart Accumulo, roughly as shown below.
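
Roughly, assuming a single-node setup with the stock Hadoop scripts (note that
reformatting wipes anything already stored in HDFS):

$ $HADOOP_HOME/bin/stop-all.sh
$ $HADOOP_HOME/bin/hadoop namenode -format
$ $HADOOP_HOME/bin/start-all.sh

# once HDFS is answering again, re-run init and start Accumulo
$ cd ~/accumulo
$ bin/accumulo init
$ bin/start-all.sh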

hope that helps
- Miguel


Re: [External] Re: Need help getting Accumulo running.

Eric Newton
In reply to this post by Jim Klucar
The call stack indicates that Accumulo cannot talk to the Hadoop NameNode.

Verify Hadoop is up and running. Oh, and don't put any of your
Hadoop/ZooKeeper data in /tmp, which is cleaned upon a reboot.
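
A couple of quick checks, assuming the single-node setup from Michael Noll's
guide (which puts the NameNode RPC port at 54310, matching the address in your
stack trace):

$ jps                                  # should show NameNode, DataNode, SecondaryNameNode, ...
$ $HADOOP_HOME/bin/hadoop fs -ls /     # only succeeds if the NameNode is actually answering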

-Eric


RE: [External] Re: Need help getting Accumulo running.

Park, Jee [USA]
In reply to this post by Jim Klucar
Ah, so I realized I wasn't running Hadoop or ZooKeeper. I now have Hadoop
running, but I cannot get ZooKeeper to run.
Here is what I did:

$ $ZOOKEEPER_HOME/bin/zkServer.sh start  
JMX enabled by default
Using config: /usr/lib/zookeeper/bin/../conf/zoo.cfg
Starting zookeeper ... /usr/lib/zookeeper/bin/zkServer.sh: 110:
/usr/lib/zookeeper/bin/zkServer.sh: Cannot create
/var/zookeeper/zookeeper_server.pid: Permission denied
FAILED TO WRITE PID


-----Original Message-----
From: Jim Klucar [mailto:[hidden email]]
Sent: Monday, July 02, 2012 1:25 PM
To: [hidden email]
Subject: Re: [External] Re: Need help getting Accumulo running.

Did you verify that zookeeper is running?

On Mon, Jul 2, 2012 at 1:21 PM, Park, Jee [USA] <[hidden email]> wrote:

> Thanks everyone for the responses!
>
> So, I got hadoop to run and installed accumulo following Miguel's
> email, the problem now is that when I do
>
> $ bin/accumulo init
>
> It tries to connect a few times and then times out. Here is what it
> prints out.
> Just to let you know I did not change anything in the
> accumulo-site.xml file
>
> Thanks,
> Jee
>
> hduser@ubuntu:~/accumulo$ bin/accumulo init
> 02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 0 time(s).
> 02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 1 time(s).
> 02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 2 time(s).
> 02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 3 time(s).
> 02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 4 time(s).
> 02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 5 time(s).
> 02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 6 time(s).
> 02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 7 time(s).
> 02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 8 time(s).
> 02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server:
> localhost/127.0.0.1:54310. Already tried 9 time(s).
> 02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException:
> Call to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused
> java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on
> connection exception: java.net.ConnectException: Connection refused at
> org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> at org.apache.hadoop.ipc.Client.call(Client.java:743)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> at $Proxy0.getProtocolVersion(Unknown Source) at
> org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> at
> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFil
> eSyste
> m.java:82)
> at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> at
> org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554
> ) at
> org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> ava:57
> )
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccess
> orImpl
> .java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.accumulo.start.Main$1.run(Main.java:89)
> at java.lang.Thread.run(Thread.java:722)
> Caused by: java.net.ConnectException: Connection refused at
> sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> at
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.
> java:2
> 06)
> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> at
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304
> ) at
> org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> at org.apache.hadoop.ipc.Client.call(Client.java:720)
> ... 20 more
> Thread "init" died null
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.j
> ava:57
> )
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccess
> orImpl
> .java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.accumulo.start.Main$1.run(Main.java:89)
> at java.lang.Thread.run(Thread.java:722)
> Caused by: java.lang.RuntimeException: java.net.ConnectException: Call
> to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused at
> org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
> ... 6 more
> Caused by: java.net.ConnectException: Call to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused at
> org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
> at org.apache.hadoop.ipc.Client.call(Client.java:743)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> at $Proxy0.getProtocolVersion(Unknown Source) at
> org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> at
> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFil
> eSyste
> m.java:82)
> at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> at
> org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554
> ) at
> org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
> ... 6 more
> Caused by: java.net.ConnectException: Connection refused at
> sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
> at
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.
> java:2
> 06)
> at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
> at
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304
> ) at
> org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
> at org.apache.hadoop.ipc.Client.call(Client.java:720)
> ... 20 more
>


Re: [External] Re: Need help getting Accumulo running.

William Slacum-2
Make sure that /var/zookeeper is writable by the user you're launching
Zookeeper as. Alternatively, you can reconfigure zookeeper's zoo.cfg file
to change the directory to somewhere that is writable.
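
For example, a minimal sketch of both options (assuming ZooKeeper is started
as the hduser account used elsewhere in this thread and zoo.cfg points its
dataDir at /var/zookeeper; adjust the names to your setup):

# Option 1: give the launching user ownership of the existing data directory
$ sudo mkdir -p /var/zookeeper
$ sudo chown -R hduser:hduser /var/zookeeper

# Option 2: point ZooKeeper at a directory that user already owns
$ mkdir -p /home/hduser/zookeeper-data
$ vi $ZOOKEEPER_HOME/conf/zoo.cfg    # set: dataDir=/home/hduser/zookeeper-data

Either way, zkServer.sh should then be able to create
/var/zookeeper/zookeeper_server.pid (or the equivalent file under the new
dataDir), which is what the error below is complaining about.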

On Mon, Jul 2, 2012 at 1:42 PM, Park, Jee [USA] <[hidden email]> wrote:

> Ah, so I realized I wasn’t running hadoop or zookeeper, and so I am running
> hadoop, but cannot get zookeeper to run
> Here is what I did:
>
> $ $ZOOKEEPER_HOME/bin/zkServer.sh start
> JMX enabled by default
> Using config: /usr/lib/zookeeper/bin/../conf/zoo.cfg
> Starting zookeeper ... /usr/lib/zookeeper/bin/zkServer.sh: 110:
> /usr/lib/zookeeper/bin/zkServer.sh: Cannot create
> /var/zookeeper/zookeeper_server.pid: Permission denied
> FAILED TO WRITE PID
>

Re: [External] Re: Need help getting Accumulo running.

Miguel Pereira
In reply to this post by Park, Jee [USA]
sudo :)

On Mon, Jul 2, 2012 at 1:42 PM, Park, Jee [USA] <[hidden email]> wrote:

> Ah, so I realized I wasn’t running hadoop or zookeeper, and so I am running
> hadoop, but cannot get zookeeper to run
> Here is what I did:
>
> $ $ZOOKEEPER_HOME/bin/zkServer.sh start
> JMX enabled by default
> Using config: /usr/lib/zookeeper/bin/../conf/zoo.cfg
> Starting zookeeper ... /usr/lib/zookeeper/bin/zkServer.sh: 110:
> /usr/lib/zookeeper/bin/zkServer.sh: Cannot create
> /var/zookeeper/zookeeper_server.pid: Permission denied
> FAILED TO WRITE PID
>

RE: [External] Re: Need help getting Accumulo running.

Park, Jee [USA]
Haha sudo gave me

sudo: /bin/zkServer.sh: command not found

-----Original Message-----
From: Miguel Pereira [mailto:[hidden email]]
Sent: Monday, July 02, 2012 1:46 PM
To: [hidden email]
Subject: Re: [External] Re: Need help getting Accumulo running.

sudo :)



Re: [External] Re: Need help getting Accumulo running.

Jim Klucar
ZooKeeper is probably not in /bin. You should have $ZOOKEEPER_HOME set to
wherever you installed it:

sudo $ZOOKEEPER_HOME/bin/zkServer.sh start
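
For example (a sketch, assuming ZooKeeper lives where the install steps quoted
earlier in this thread put it; substitute your own path):

$ export ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5
$ sudo $ZOOKEEPER_HOME/bin/zkServer.sh start   # the shell expands $ZOOKEEPER_HOME before sudo runs
$ echo ruok | nc localhost 2181                # prints "imok" if ZooKeeper is answering on the default port

One caveat: starting it under sudo will create the pid and data files as root,
so fixing the permissions on /var/zookeeper (or moving the dataDir) is
probably the cleaner long-term fix.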

On Mon, Jul 2, 2012 at 1:46 PM, Park, Jee [USA] <[hidden email]> wrote:

> Haha sudo gave me
>
> sudo: /bin/zkServer.sh: command not found
>

RE: [External] Re: Need help getting Accumulo running.

Park, Jee [USA]
In reply to this post by William Slacum-2
Hi, I used sudo chmod a-x /var/zookeeper, and I am still getting permission
denied.
How do I make sure /var/zookeeper is writable?

-----Original Message-----
From: William Slacum [mailto:[hidden email]]
Sent: Monday, July 02, 2012 1:45 PM
To: [hidden email]
Subject: Re: [External] Re: Need help getting Accumulo running.

Make sure that /var/zookeeper is writable by the user you're launching
Zookeeper as. Alternatively, you can reconfigure zookeeper's zoo.cfg file to
change the directory to somewhere that is writable.



Re: [External] Re: Need help getting Accumulo running.

Jim Klucar
a-x removes execute permission for everyone, which makes the directory
untraversable. Just do sudo chmod -R 777 /var/zookeeper to open up the
permissions.
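
For reference, a quick sketch of that fix plus a tighter alternative (assuming
ZooKeeper is started as the hduser account used elsewhere in this thread):

# the earlier "chmod a-x" removed the directory's search bit; either open it up:
$ sudo chmod -R 777 /var/zookeeper
# or, more restrictively, hand it to the user that launches ZooKeeper:
$ sudo chown -R hduser:hduser /var/zookeeper
$ sudo chmod -R u+rwx /var/zookeeper
# then start it again, without sudo:
$ $ZOOKEEPER_HOME/bin/zkServer.sh start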

Sent from my iPhone

On Jul 2, 2012, at 3:28 PM, "Park, Jee [USA]" <[hidden email]> wrote:

> Hi, I used sudo chmod a-x /var/zookeeper, and still am getting permission
> denied
> How do I make sure /var/zookeeper is writable?
>

RE: [External] Re: Need help getting Accumulo running.

Park, Jee [USA]
In reply to this post by David Medinets
Hello,
I currently have Hadoop, ZooKeeper, and Accumulo running; however, I keep
getting the following error when trying to start the Accumulo shell:

~$: accumulo/bin/accumulo shell -u root
Enter current password for 'root'@'accumulo': ********
05 18:18:26,233 [shell.Shell] ERROR:
org.apache.accumulo.core.client.AccumuloSecurityException: Error
BAD_CREDENTIALS - Username or Password is Invalid

Thanks in advance.
-----Original Message-----
From: David Medinets [mailto:[hidden email]]
Sent: Friday, June 29, 2012 7:37 PM
To: [hidden email]
Subject: [External] Re: Need help getting Accumulo running.

oh... I think you missed a few steps from the gist:

$ cd ~
$ export TAR_DIR=~/workspace/accumulo/src/assemble/target
$ tar xvzf $TAR_DIR/accumulo-1.5.0-incubating-SNAPSHOT-dist.tar.gz

# Add the following to your .bashrc file.
$ export ACCUMULO_HOME=~/accumulo-1.5.0-incubating-SNAPSHOT

$ cd $ACCUMULO_HOME/conf

> These are the steps where you unpack the newly created .gz file into your
> home directory. It seems like you are running Accumulo from the source code
> directory. Also notice that I wrote those steps for v1.5.0, which might be
> different from v1.4.0.
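
A quick sanity check for this point, as a sketch (the paths are the ones used
earlier in the thread, so treat them as assumptions):

$ echo $ACCUMULO_HOME      # should point at the unpacked dist directory, not the git checkout
$ ls $ACCUMULO_HOME/bin/accumulo $ACCUMULO_HOME/conf/accumulo-site.xml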



Re: [External] Re: Need help getting Accumulo running.

Jim Klucar
It is asking for the password you set up for the Accumulo root user,
not the machine root user. It's whatever you typed in when you ran the
$ACCUMULO_HOME/bin/accumulo init command.
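
If that init password has been lost, one blunt recovery sketch is to wipe the
test instance and initialize again. This assumes a throwaway instance with no
data worth keeping, and the default /accumulo directory in HDFS:

$ bin/stop-all.sh
# Destructive: removes the old instance data from HDFS
$ $HADOOP_HOME/bin/hadoop fs -rmr /accumulo
# Re-initialize, choosing a fresh root password when prompted, then restart
$ bin/accumulo init
$ bin/start-all.sh
$ bin/accumulo shell -u root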


RE: [External] Re: Need help getting Accumulo running.

Park, Jee [USA]
That's exactly what I typed in for the password; however, it is still giving me
that error. Also, if it makes any difference, I have not changed anything in
the accumulo-site.xml file.



Re: [External] Re: Need help getting Accumulo running.

Jim Klucar
Can you see the monitor page at http://localhost:50095?
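
A quick way to check from the VM, as a rough sketch:

# List running Java processes; the Hadoop daemons, ZooKeeper's QuorumPeerMain,
# and the Accumulo processes should all show up (use -m to see their arguments)
$ jps -lm
# The monitor should answer on its default port if it is running
$ curl -s http://localhost:50095 | head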

