2016

streamline your analyses linking R to SAS and more: the workfloweR 2016/09/21

We all know R is the first choice for statistical analysis and data visualisation, but what about big data munging? The tidyverse (or should we say hadleyverse 😏) has been doing a lot in this field; nevertheless, this kind of activity is often handled in some other coding language. Moreover, sometimes you get as input pieces of analyses performed in other languages or, what is worse, pieces of databases packed in a proprietary format (like .
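
The excerpt stops before showing workfloweR itself, but as a minimal illustration of the proprietary-format pain point, here is a sketch of reading a SAS dataset straight into R with the haven package (the file path below is just a placeholder, not a file from the post):

```r
# install.packages("haven")  # if not already installed
library(haven)

# Read a SAS dataset into an R data frame (path is a placeholder)
sales <- read_sas("data/sales.sas7bdat")

# From here the data can flow into the usual R toolchain
str(sales)
summary(sales)
```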


Over 50 practical recipes for data analysis with R in one book 2016/05/11

Ah, writing a blog post! This is a pleasure I had been forgetting, as you can guess from the date of the last post: it was back around January... You may be wondering: what have you done in all this time? Well, quite a lot indeed: I changed my job (I am now working @ Intesa Sanpaolo Banking Group on Basel III statistical models) and became a dad for the third time (and if you are guessing, it’s a boy!


2015

how to list loaded packages in R: ramazon gets clever 2015/09/10

It was around midnight here in Italy: I shared the code on GitHub, published a post on G+, LinkedIn and Twitter, and then went to bed. Over the next hours things kept growing by themselves, with pleasant results like the following: https://twitter.com/DoodlingData/status/635057258888605696 The R community found ramazon a really helpful package. And I actually think it is: Amazon AWS is nowadays one of the most common tools for hosting web applications and websites.
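
The excerpt stays on the launch story, but since the title is about listing loaded packages, here is what the base-R calls answering that question look like (nothing here is specific to ramazon):

```r
# Packages currently attached to the search path
(.packages())

# All namespaces loaded in the session (attached or not)
loadedNamespaces()

# Full session picture: R version, attached packages, loaded namespaces
sessionInfo()
```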


ramazon: Deploy your Shiny App on AWS with a Function 2015/08/18

Because Afraus received good interest, last month I went over the shinyapps.io free plan limits. That made me move my Shiny App to an Amazon AWS instance. Well, it was not so straightforward: even though there are plenty of tutorials around the web, every one seems to miss a part: upgrading the R version, removing the shiny-server examples… And even with all the info, it is still quite a long, error-prone process.


Introducing Afraus: an Unsupervised Fraud Detection Algorithm 2015/07/02

The last Report to the Nations published by the ACFE stated that, on average, fraud accounts for nearly 5% of companies’ revenues. Projecting this number onto the whole world’s GDP, the resulting “fraud country” would produce a GDP something like three times greater than Canada’s.


How to add a live chat to your Shiny app 2015/05/11

As I am currently working on a Fraud Analytics Web Application based on Shiny (currently in beta, more on this blog later), I found myself asking: wouldn’t it be great to offer live chat support to my Web Application’s visitors? It would indeed! [Image: an ancient example of chatting - Camera degli Sposi, Andrea Mantegna, 1465-1474] But how to do it? Unfortunately, searching on Google didn’t give any useful result.
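
The excerpt ends before the solution, but the general pattern for this kind of thing is injecting a chat provider’s JavaScript snippet into the Shiny UI. A minimal sketch, assuming a hypothetical widget script at https://example-chat.com/widget.js (the provider URL is a placeholder, not necessarily the service used in the post):

```r
library(shiny)

ui <- fluidPage(
  # Inject the chat provider's embed script into the page <head>
  tags$head(
    tags$script(src = "https://example-chat.com/widget.js", async = NA)
  ),
  titlePanel("Fraud Analytics app with live chat"),
  plotOutput("hist")
)

server <- function(input, output, session) {
  output$hist <- renderPlot(hist(rnorm(100)))
}

shinyApp(ui, server)
```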


Catching Fraud with Benford's law (and another Shiny App) 2015/02/06

In the early 1900s Frank Benford observed that ‘1’ appeared more often than other digits as the first digit in his own book of logarithm tables. More than one hundred years later, we can use this curious finding to look for fraud in populations of data. Just give the Shiny app a try. The post covers what Benford’s Law stands for, what you can do with it (you can find fraud with it), some precautions, and BenfordeR: another lean Shiny application for performing a Benford analysis, plotting results and detecting suspected records.
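
For reference, Benford’s Law says the first digit d occurs with probability log10(1 + 1/d). A small base-R sketch, using simulated amounts as a stand-in for real transaction data, comparing observed first-digit frequencies to that expectation:

```r
# Expected first-digit frequencies under Benford's Law
benford_expected <- log10(1 + 1 / (1:9))

# Simulated "transaction amounts" as placeholder data
set.seed(42)
amounts <- round(rlnorm(5000, meanlog = 6, sdlog = 1.5), 2)

# First significant digit of each amount
first_digit <- as.integer(substr(formatC(amounts, format = "e"), 1, 1))

# Observed frequencies vs Benford's expectation
observed <- table(factor(first_digit, levels = 1:9)) / length(first_digit)
round(cbind(observed = as.numeric(observed), expected = benford_expected), 3)

# Large gaps between the two columns flag digits worth investigating
```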


2014

Querying Google With R 2014/11/19

If you have a blog, you may want to discover how your website is performing for given keywords on the Google search engine. As we all know, this is not a trivial topic. The problem is that the manual solution would be quite time-consuming, requiring you to search for your website for every single keyword, across many, many result pages. Feeling this way? [Image: “Pain and fear, pain and fear for me” - Oliver Twist]
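
The excerpt stops before the how-to, but a minimal sketch of the general idea is fetching a results page with httr and parsing it with rvest to look for your domain. The keyword, domain, and selector below are placeholder assumptions, not the post’s actual approach, and Google’s markup and terms of service may not cooperate:

```r
library(httr)
library(rvest)

keyword   <- "unsupervised fraud detection"  # placeholder keyword
my_domain <- "example.com"                   # placeholder domain

# Fetch the first Google results page for the keyword
resp <- GET("https://www.google.com/search",
            query = list(q = keyword, num = 100),
            user_agent("Mozilla/5.0"))

# Extract result links; the "a" selector is a rough assumption
links <- resp |>
  read_html() |>
  html_elements("a") |>
  html_attr("href")

# Position of the first link pointing at our domain, if any
which(grepl(my_domain, links, fixed = TRUE))[1]
```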
