The story of Type Classes begins, as few great stories do, on Twitter, where Julie and Chris first met. To find out exactly when it began, in October 2016 we analyzed Chris’s tweet history to find tweets directed from Chris to Julie. At the time, Twitter provided its data exports in CSV format, and we used the cassava library to parse it.
Grouping the tweets into ten-day ranges and counting each group produced a histogram that looked like this:
2016-06-02 2
2016-06-12 6
2016-06-22 4
2016-07-02 0
2016-07-12 11
2016-07-22 6
2016-08-01 59
This nostalgic bit of code is now memorialized as a framed print in our office.
{-# LANGUAGE OverloadedStrings #-}

module Julie (main) where

import qualified Data.ByteString.Lazy as Bs
import qualified Data.Csv as Csv
import qualified Data.Text as Text
import qualified Data.Time as Time

import Control.Applicative (liftA2)
import Control.Monad (MonadPlus, guard, mfilter)
import Data.Function ((&))
import Data.Monoid ((<>))
import Data.Vector (Vector)
import Data.Text (Text)
import Data.Time (Day)

main :: IO ()
main = do
  bs <- Bs.readFile "tweets.csv"
  let parsed = (Csv.decode Csv.HasHeader bs) :: Either String (Vector [Text])
  case parsed of
    Left err -> putStrLn err
    Right rows ->
      let julieDays = findJulieDays rows
          firstDay = minimum julieDays
          lastDay = maximum julieDays
          nextBin = Time.addDays 10
          bins = firstDay & iterate nextBin & takeWhile (< lastDay)
      in sequence_ $ do
           bin <- bins
           let count = julieDays
                 & mfilter (liftA2 (&&) (>= bin) (< nextBin bin))
                 & length
           return $ putStrLn $ show bin <> " " <> show count

findJulieDays :: MonadPlus m => m [Text] -> m Day
findJulieDays rows = do
  row <- rows
  let timestamp = Time.parseTimeOrError False Time.defaultTimeLocale
        "%Y-%m-%d %H:%M:%S %z" $ Text.unpack $ row !! 3
  let text = row !! 5
  guard $ "@argumatronic" `Text.isInfixOf` text
  return timestamp
Four years later, we revisited this code to think about what we might have done differently if we were writing it today. Could it be written more clearly? More efficiently? With other parsing libraries? Are there other interesting things we could learn from the data? Come join us on a trip through the design space and down memory lane.
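To give a flavor of that design space, here is a minimal sketch of one direction a rewrite could take: cassava’s named-record decoding instead of positional indexing into each row, and Data.Map.Strict for the binning. This is only an illustration, not the revision we eventually landed on; the column names "timestamp" and "text", and the module name JulieRevisited, are assumptions about the shape of the old Twitter export.

{-# LANGUAGE OverloadedStrings #-}

module JulieRevisited (main) where

import qualified Data.ByteString.Lazy as Bs
import qualified Data.Csv as Csv
import qualified Data.Map.Strict as Map
import qualified Data.Text as Text
import qualified Data.Time as Time
import qualified Data.Vector as Vector

import Control.Applicative (empty)
import Data.Text (Text)
import Data.Time (Day)

data Tweet = Tweet { tweetDay :: Day, tweetText :: Text }

-- The column names "timestamp" and "text" are assumptions about the
-- header row of the old Twitter export.
instance Csv.FromNamedRecord Tweet where
  parseNamedRecord r = do
    stamp <- r Csv..: "timestamp"
    body  <- r Csv..: "text"
    day <- maybe empty pure
      (Time.parseTimeM True Time.defaultTimeLocale "%Y-%m-%d %H:%M:%S %z" stamp)
    pure (Tweet day body)

main :: IO ()
main = do
  bs <- Bs.readFile "tweets.csv"
  case Csv.decodeByName bs of
    Left err -> putStrLn err
    Right (_header, tweets) -> do
      -- Like the original, this assumes at least one tweet mentions Julie.
      let julieDays =
            [ tweetDay t
            | t <- Vector.toList tweets
            , "@argumatronic" `Text.isInfixOf` tweetText t ]
          firstDay = minimum julieDays
          -- Start of the ten-day bin that a given day falls into
          binStart day =
            Time.addDays (negate (Time.diffDays day firstDay `mod` 10)) day
          counts =
            Map.fromListWith (+) [ (binStart d, 1 :: Int) | d <- julieDays ]
      mapM_ (\(d, n) -> putStrLn (show d <> " " <> show n))
            (Map.toAscList counts)

One visible difference from the original output: Map.fromListWith only produces bins that contain at least one matching tweet, so zero-count rows like 2016-07-02 0 would disappear from the histogram.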