Running DHCPerf on Ubuntu

DHCPerf is an open-source load-testing tool for DHCP server setups. It generates DHCP client traffic at a configurable load, which makes it handy for stress-testing scenarios.

I have tested this with Ubuntu 12.04 and Nominum's DHCPerf 1.0.

Steps:

  • Download the Red Hat Enterprise Linux ES 3 (x86) build of DHCPerf from here: http://www.nominum.com/support/measurement-tools/

  • We need the alien tool to convert the downloaded RPM into a Debian package that can be installed on Ubuntu.

  • Install alien by running:


$ sudo apt-get install alien

  • Convert the downloaded RPM to Debian package format with alien using the following command:


$ sudo alien -k DHCPPerf1.0.1.0.rpm


  • The converted Debian package is placed in the same folder. Install it using the following command:


$ sudo dpkg -i DHCPPerf1.0.1.0.deb

  • By default, DHCPerf gets installed under /usr/local/nom/...

  • To run it, navigate to the bin folder under nom and run the dhcperf command:


$ ./dhcperf --help


to see the command's usage. The whole sequence is consolidated in the sketch below.
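
Putting the steps above together, a minimal end-to-end session looks roughly like this. The RPM/.deb filenames follow the ones used in this post, and the bin path is an assumption based on the default /usr/local/nom location; adjust both to match what you actually get:

$ sudo apt-get install alien
$ sudo alien -k DHCPPerf1.0.1.0.rpm      # converts the RPM to a .deb in the same folder
$ sudo dpkg -i DHCPPerf1.0.1.0.deb
$ cd /usr/local/nom/bin                  # assumption: the binary may sit deeper under /usr/local/nom
$ ./dhcperf --help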


Run Wireshark and set the display filter to bootp to see the DHCP packets on the wire.
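
To check the same thing from the command line, tshark (Wireshark's terminal counterpart) can capture on the DHCP ports. The interface eth0 below is an assumption; substitute the interface dhcperf is actually sending on:

$ sudo tshark -i eth0 -f "udp port 67 or udp port 68"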
