Linux CSV Command Line Tool XSV

last updated: August 8th, 2020 by Aadi

In this post, I will introduce XSV, an excellent command line tool for CSV file parsing and data manipulation. You can install the tool from GitHub: https://github.com/BurntSushi/xsv
Let us begin the tutorial.
If you are on CentOS, the following installation commands will work.

yum install cargo
cargo install xsv
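Once cargo finishes, you can confirm the binary was built. A minimal check (the path is cargo's default for root; the version printed will depend on what cargo built):

```shell
# Run the freshly installed binary by its full path; it should print
# something like "xsv 0.13.0"
/root/.cargo/bin/xsv --version
```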
By default, cargo installs it in /root/.cargo/bin/xsv.
If you are not root, you might have to either set an alias or add the tool to your bash PATH.

alias xsv=/root/.cargo/bin/xsv

As an example, I will look at stock data which I downloaded from Yahoo Finance for the Apple stock.
Let us first look at our data.
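To keep this post reproducible, here is a tiny sample that mimics Yahoo Finance's CSV layout; the file name AAPL.csv and the rows are illustrative assumptions, not the actual download:

```shell
# Hypothetical sample mimicking Yahoo Finance's CSV export (illustrative data)
cat > AAPL.csv <<'EOF'
Date,Open,High,Low,Close
2020-08-03,432.80,446.55,431.57,435.75
2020-08-04,436.53,443.16,433.55,438.66
EOF

# Peek at the header plus the first data row
head -2 AAPL.csv
```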

Read CSV file using XSV
Now let us try the command "xsv table", which displays the data in a nice tabular form. Instead of the Linux 'head -2' command, we can also use the xsv slice command.
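The two commands above can be sketched as follows. The file AAPL.csv is the illustrative sample from earlier, not the real download; `xsv slice -l 1` keeps the header plus the first record, matching what `head -2` shows:

```shell
# Align columns for readability
xsv table AAPL.csv

# Equivalent of 'head -2': header plus the first data row,
# piped back through 'xsv table' for the aligned view
xsv slice -l 1 AAPL.csv | xsv table
```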