Archive for the ‘How To’ Category

FTP Client Using Apache Common net library using Socks Proxy Server

August 2, 2012 1 comment

As we built the FTP server in the previous post, here is an FTP client using the Apache Commons Net library. Below is basic code we can use to connect, over a SOCKS proxy server, to the FTP server we created. (This assumes you already have a SOCKS proxy server running; if not, the code can simply be used as an FTP client connecting directly to the FTP server.)

The image below shows the libraries required and the parameters passed from build.xml (this can also be done on the command line).


The image below shows the changes required to route all packets through the proxy server.
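One common way to route all of the client's traffic through the proxy (a sketch, not necessarily the exact change shown in the image) is to set the JVM's standard SOCKS properties before connecting; `java.net.Socket`, which Commons Net uses underneath, honors them automatically. The host `localhost` and port `1080` here are placeholder values for your proxy.

```java
public class SocksProxySetup {

    // Route all plain sockets created by this JVM through a SOCKS proxy.
    public static void enableSocksProxy(String host, int port) {
        System.setProperty("socksProxyHost", host);
        System.setProperty("socksProxyPort", String.valueOf(port));
    }

    public static void main(String[] args) {
        // Placeholder proxy address; replace with your SOCKS server's host/port.
        enableSocksProxy("localhost", 1080);
        System.out.println("SOCKS proxy: "
                + System.getProperty("socksProxyHost") + ":"
                + System.getProperty("socksProxyPort"));
    }
}
```

This has to run before the first socket is opened; after that, `FTPClient.connect(...)` goes through the proxy without any further changes.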



Libraries required:

<path id="FtpClientDemo.classpath">
    <pathelement location="bin"/>
    <pathelement location="lib/commons-net-3.1-ftp.jar"/>
    <pathelement location="lib/commons-net-3.1.jar"/>
</path>

Sample args in the build.xml file:

<target name="FtpClientDemo">
    <java classname="com.ftp.client.FtpClient" failonerror="true" fork="yes">
        <jvmarg line=""/>
        <!-- Proxy argument below -->
        <!-- <arg line="-PrH -p true -b -s sipl sipl LookAppCode.rar f:\LookAppCode.rar"/> -->
        <!-- Normal connection to FTPServer without proxy -->
        <arg line="-l ahmed ahmed"/>
        <classpath refid="FtpClientDemo.classpath"/>
    </java>
</target>
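The client itself boils down to the standard Commons Net calls. A minimal sketch (host, port and credentials are placeholders matching the `-l ahmed ahmed` arg above; error handling is trimmed):

```java
import java.io.IOException;

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPReply;

public class FtpClientSketch {
    public static void main(String[] args) throws IOException {
        FTPClient ftp = new FTPClient();
        try {
            // Connect and check the server's welcome reply code.
            ftp.connect("localhost", 21);
            if (!FTPReply.isPositiveCompletion(ftp.getReplyCode())) {
                ftp.disconnect();
                throw new IOException("FTP server refused connection");
            }
            // Log in with the same credentials passed via -l in build.xml.
            if (ftp.login("ahmed", "ahmed")) {
                ftp.enterLocalPassiveMode();
                for (String name : ftp.listNames()) {
                    System.out.println(name);
                }
                ftp.logout();
            }
        } finally {
            if (ftp.isConnected()) {
                ftp.disconnect();
            }
        }
    }
}
```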

Read more…

FTP Server using Apache FTPServer Library

August 2, 2012 5 comments

I was working on getting an FTP server running.

Creating an FTP server using the Apache library is very simple.


As you can see in the image, you will need a few libraries; all of them can be found here:

Below is the code to get started. You will also get all the properties files from the above link.

Content for

log4j.rootLogger=DEBUG, C
# Appender definition (a ConsoleAppender here); the original snippet only showed the root logger and pattern.
log4j.appender.C=org.apache.log4j.ConsoleAppender
log4j.appender.C.layout=org.apache.log4j.PatternLayout
log4j.appender.C.layout.ConversionPattern=[%5p] %d [%X{userName}] [%X{remoteIp}] %m%n

Content for

# Password is "admin"


The code below can be found in the FtpServer source examples; I modified it a little for my needs.

package com.ftp.server;

import java.io.File;

import org.apache.ftpserver.FtpServer;
import org.apache.ftpserver.FtpServerFactory;
import org.apache.ftpserver.ftplet.User;
import org.apache.ftpserver.ftplet.UserManager;
import org.apache.ftpserver.listener.ListenerFactory;
import org.apache.ftpserver.usermanager.PropertiesUserManagerFactory;
import org.apache.ftpserver.usermanager.SaltedPasswordEncryptor;
import org.apache.ftpserver.usermanager.UserFactory;

public class EmbeddingFtpServer {

    public static void main(String[] args) throws Exception {

        FtpServerFactory serverFactory = new FtpServerFactory();
        ListenerFactory factory = new ListenerFactory();

        // set the port of the listener
        factory.setPort(8080);

        // replace the default listener
        serverFactory.addListener("default", factory.createListener());

        System.out.println("Adding Users Now");
        PropertiesUserManagerFactory userManagerFactory = new PropertiesUserManagerFactory();
        // path to the users properties file (left blank in the original)
        userManagerFactory.setFile(new File(""));

        userManagerFactory.setPasswordEncryptor(new SaltedPasswordEncryptor());
        UserManager userManagement = userManagerFactory.createUserManager();
        UserFactory userFact = new UserFactory();
        // credentials matching the FTP client post; "F:/" becomes the root directory
        userFact.setName("ahmed");
        userFact.setPassword("ahmed");
        userFact.setHomeDirectory("F:/");
        User user = userFact.createUser();
        userManagement.save(user);
        serverFactory.setUserManager(userManagement);

        // start the server
        FtpServer server = serverFactory.createServer();

        System.out.println("Server Starting on port " + factory.getPort());
        server.start();
    }
}
Once done, the server will be running on port 8080 (factory.setPort(8080);) and the "F:/" directory (userFact.setHomeDirectory("F:/");) will be the root directory. Connect to the server using any FTP client, such as FileZilla.

NOTE: The SSL part is not working; will update once done.

Categories: BigData, How To

Getting Data from RestFB and Creating Sequence File > Hadoop

July 16, 2012 Leave a comment

Here is quick code to get data from Facebook using the RestFB API, create a sequence file, and dump the data into a Hadoop cluster.

Prerequisites:


  • Hadoop 1.0.3 Installed as Stand Alone or Multinode.
  • Eclipse IDE for development
  • Hadoop and Apache commons jars.
  • RestFB APIs

Steps to create the Eclipse project:

  • Create a new Java project.
  • Add the jars to the project (Apache Commons and hadoop-core-1.0.3.jar), and add the RestFB jar.
  • You will find all the (Commons and Hadoop) jars under the Hadoop directory.

Sequence File Content Format.

  • Key – <facebook_id, facebook_name, timestamp>
  • Value – <batch_me, batch_me_friends, batch_me_likes>

Add the code below to get data from Facebook and generate the sequence file. Before you start, you need to update the AccessToken in the code with your own access token from Facebook. You might want to look here.
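The sequence-file side can be sketched with Hadoop's `SequenceFile.Writer` (Hadoop 1.x API). The output path and the sample key/value below are placeholders; the real code fills them from the RestFB responses, following the key/value format listed above.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("facebook-data.seq"); // placeholder path

        // Key: <facebook_id, facebook_name, timestamp> (placeholder values)
        Text key = new Text("<100001, Some Name, 1342400000>");
        // Value: <batch_me, batch_me_friends, batch_me_likes> (placeholder values)
        Text value = new Text("<me-json, friends-json, likes-json>");

        // Hadoop 1.x style createWriter; writes key/value pairs to HDFS
        // (or the local FS in stand-alone mode).
        SequenceFile.Writer writer = SequenceFile.createWriter(
                fs, conf, path, Text.class, Text.class);
        try {
            writer.append(key, value);
        } finally {
            writer.close();
        }
    }
}
```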

Read more…

Getting Batch Data from Facebook using restFB APIs

July 14, 2012 Leave a comment

Here is a quick code sample to get data from the Facebook Batch API.

Download the jar from here –

Put it on your library path and execute the code below.

Go to this link and log in to Facebook to get your access token:

Change the code to pass your access token directly: DefaultFacebookClient facebookClient = new DefaultFacebookClient("<<<ACCESSTOKEN HERE>>>");
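A batch request with RestFB looks roughly like this (a sketch using RestFB's batch API; the paths "me" and "me/friends" are the usual Graph API connections, and the token is a placeholder):

```java
import java.util.List;

import com.restfb.DefaultFacebookClient;
import com.restfb.FacebookClient;
import com.restfb.batch.BatchRequest;
import com.restfb.batch.BatchRequest.BatchRequestBuilder;
import com.restfb.batch.BatchResponse;

public class BatchDemo {
    public static void main(String[] args) {
        FacebookClient facebookClient =
                new DefaultFacebookClient("<<<ACCESSTOKEN HERE>>>");

        // Build one request per Graph API path we want in the batch.
        BatchRequest meRequest = new BatchRequestBuilder("me").build();
        BatchRequest friendsRequest = new BatchRequestBuilder("me/friends").build();

        // Execute both requests in a single round trip.
        List<BatchResponse> responses =
                facebookClient.executeBatch(meRequest, friendsRequest);

        for (BatchResponse response : responses) {
            System.out.println(response.getCode() + ": " + response.getBody());
        }
    }
}
```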

Read more…

Installing Hadoop 1.0.3 on Ubuntu Single Node Cluster using shell script

July 14, 2012 Leave a comment

I was working on setting up Hadoop on Ubuntu as a single-node cluster.

I came across a very nice blog about it here (a must-read to set up your single-node cluster).
While I was at it, I found myself installing Hadoop multiple times on different systems, so I thought I would create a script of my own, based on the blog above.

Here is the link to my script on GitHub; anyone interested can check it out and enhance the script.



Requirements:

1. Hadoop 1.0.3

2. Ubuntu 10.04 or above (tested on 11.04, 11.10 and 12.04, 32-bit platform)

Here are the details on how to install Hadoop using the script, from the README:

- Hadoop script to set up a single-node cluster, for Hadoop 1.0.3 only.

- Tested on Ubuntu 11.10, 12.04 (fresh install).

- The script assumes nothing Hadoop-related is installed, and installs the components required for Hadoop to run.

- This script was created using the installation guide by Michael Noll.


Steps for executing the script: currently the script only takes a single option at a time :(


Executing help:

]$ sudo ./ --help

usage: ./ <single-parameter>

  Optional parameters:

     --install-init, -i      Initialization script to install Hadoop as a single-node cluster.

     Use the options below once you are logged in as the Hadoop user 'hduser' created by the -i init script above.

     --install-ssh, -s       Install ssh-keygen -t rsa -P

     --install-bashrc, -b    Update '.bashrc' with JAVA_HOME, HADOOP_HOME.

     --ipv6-disable, -v      Disable IPv6 support. [Might not be required; updates 'conf/' with the '' option in -e]

     --hostname-update, -u   Update the hostname for the system.

     --config-update, -c     Update configuration with default (single-node) values in core-site.xml, mapred-site.xml, hdfs-site.xml.

     --update-hadoop-env, -e Update the Hadoop env script with JAVA_HOME.

     --help, -h              Display this message.


Read more…

Configure Master / Slave Replication MySQL XAMPP

April 24, 2012 4 comments

MySQL Master Server Configuration

First, let's go to the Replication panel in XAMPP.

Screenshot at 2012-04-24 02_13_51
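Behind the panel, the master side boils down to a couple of my.cnf settings; a sketch with typical values (the database name is a placeholder, and server-id must be unique per server):

```ini
# my.cnf (master) - typical replication settings
[mysqld]
server-id    = 1          ; unique ID for this server
log_bin      = mysql-bin  ; enable the binary log the slave reads from
binlog_do_db = mydb       ; (optional) only replicate this database; placeholder name
```

After changing these, restart MySQL so the binary log is enabled before creating the replication user.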

Read more…

Installing OpenNMS on CentOS – using RPM.

April 13, 2012 Leave a comment
As in my previous post, this again is for the IT team. I was helping them set up a network monitoring tool.

Here are the steps to follow to install OpenNMS.

Step 1: Configure the OpenNMS repository RPM.
]# yum install yum-fastestmirror
Categories: How To