Been a while since I blogged. There are a lot of things in the works, but they involve a fair bit of scripting, so I'm spending more time smoothing out the rough edges before I blog about them. Projects include:
new 10046 parser aimed specifically at analyzing I/O
IO/ZFS/NFS monitor for NFS filers running on top of ZFS
tcpdump analyzer for NFS traffic speeds
wrapper for fio to run a database-centric set of I/O benchmarks and format the output nicely
orastat – a vmstat-like tool for monitoring Oracle I/O latency and IOPS
moats.sql extension to include I/O histograms (I might have to release this as a separate tool if Adrian and Tanel don't incorporate the changes)
analysis of the impact of filesystemio_options on non-buffer-cache I/O
But in the meantime, methods of visualizing quantitative data are consistently of interest to me. Excel, Tableau, and R are the tools I've used most recently, and R came out on top of the pile as the only one of the three that could properly render a three-dimensional graph (there's a small R sketch of this at the end of the post). R is on my list of exciting tools to spend more time with. At a recent D3 meetup in SF, the majority of attendees used R to mock up visualizations before coding them in D3. On that note, what tools are other people using? Here is a good list from the Eyeo conference:
Hand-written calculations and paper highlights (Stefanie Posavec)
Java (Martin Wattenberg and Fernanda Viegas)
Processing and MySQL, PDF exported into Illustrator and InDesign (Felton)
Tableau and Excel and Gephi and Processing and Photoshop (Moritz Stefaner)
R (Amanda Cox) [I know this from the Chartsnthings blog and a previous talk; although the NYT Graphics team uses many other visual design and development tools as a group]
Processing (Ben Fry, Wes Grubbs, Jer Thorp)
Dat.GUI (Koblin)
Of the list above, check out the blog from Amanda Cox and the NYT Graphics team: the Chartsnthings blog.
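Since I mentioned R's 3-D plotting, here is a minimal sketch using base R's persp() function. The numbers are purely synthetic (a made-up latency surface over concurrency and I/O size, not output from any of the tools above); it's only meant to show the kind of surface plot I had in mind.

# synthetic grid: concurrency on one axis, I/O size on the other
x <- seq(1, 64, length.out = 30)                      # concurrent sessions
y <- seq(4, 1024, length.out = 30)                    # I/O size in KB
z <- outer(x, y, function(u, s) 0.2 + u * s / 50000)  # made-up latency in ms

# base R 3-D surface plot
persp(x, y, z, theta = 30, phi = 25,
      col = "lightblue", shade = 0.5, ticktype = "detailed",
      xlab = "concurrency", ylab = "I/O size (KB)", zlab = "latency (ms)")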