How long is “keep it around for quite a while”?
How much of a window do you want to be able to query on interactively?
We have the concept of a capped collection, where a table is defined with a maximum number of rows, and as more rows are added the oldest rows drop off once the maximum size is reached. With this approach users always have the last X rows available for full-granularity queries. So your 100 million rows a day would just land in the DB and remain available until the table rolls them off as enough new data is loaded in.
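As a rough sketch of that roll-off behaviour (pure Python, using a `deque` with `maxlen` as a stand-in for the capped table; the names and the 5-row cap are illustrative, not the real database API):

```python
from collections import deque

# Hypothetical stand-in for a capped table holding the last 5 rows;
# a real deployment would size the cap to cover the interactive query
# window (e.g. a few days' worth of your 100M rows/day).
capped = deque(maxlen=5)

for row_id in range(8):          # insert 8 rows into a 5-row cap
    capped.append({"id": row_id})

# The oldest rows (0, 1, 2) have silently dropped off;
# only the newest 5 remain queryable at full granularity.
print([r["id"] for r in capped])   # -> [3, 4, 5, 6, 7]
```

The point is that inserts never block and no explicit delete/cleanup job is needed; the structure itself enforces the retention window.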
Rather than update, why can't you just insert a new aggregated record for each unit of granularity you're interested in? I.e., at the end of each hour, run a query to generate the aggregates you are interested in for the previous hour and insert them into a different table.
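A minimal sketch of that hourly insert-only aggregation, in pure Python (the event shape, `aggregate_hour` helper, and sample data are all hypothetical, just to show the idea of summarizing the previous hour into fresh summary rows rather than updating existing ones):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical raw events: (timestamp, key, value)
raw = [
    (datetime(2024, 1, 1, 9, 5),  "clicks", 3),
    (datetime(2024, 1, 1, 9, 40), "clicks", 2),
    (datetime(2024, 1, 1, 9, 55), "views",  7),
    (datetime(2024, 1, 1, 10, 2), "clicks", 1),  # outside the 09:00 window
]

def aggregate_hour(events, hour_start):
    """Sum values per key over the one-hour window starting at hour_start."""
    hour_end = hour_start + timedelta(hours=1)
    totals = defaultdict(int)
    for ts, key, value in events:
        if hour_start <= ts < hour_end:
            totals[key] += value
    # One new summary row per key, ready to insert into the aggregates table.
    return [{"hour": hour_start, "key": k, "total": v}
            for k, v in totals.items()]

rows = aggregate_hour(raw, datetime(2024, 1, 1, 9))
print(rows)  # clicks total 5, views total 7 for the 09:00 hour
```

Because each hour produces brand-new rows, there are no in-place updates to contend with, and the raw rows can roll off the capped table while the compact hourly summaries live on indefinitely.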