# Simple server-side e-mail aggregation setup
## TODO
- SECURITY: The script just `cat`s together the system config and the user
config, which means the user config can override any variable. Make sure that
is safe and harmless (e.g. that a user can't cause a DDoS against some other
server); otherwise modify the script to `sed` out the harmful parts or
similar (see the config-merging sketch after this list).
- SECURITY: Consider running the cron job as new user 'doar', not as root
- Make the script proceed even if one mpop call fails, so that all users don't
suffer from a single user's badly written account definition. Preferably make
the script send an e-mail to that user with the error message (see the
per-user loop sketch after this list).
- Add support for RSS aggregation; check the Debian packages feed2imap and
rss2email. Another option is to combine many feeds into one, but then the
various client machines you use don't share read/unread status. IDEA: Feeds
can be shared by pulling them into a *public* Citadel folder rather than a
private one. Not only does this reflect the interests of the community and
help people find content, it also saves space by holding just a single copy of
the feed items on the server, rather than one per user. One open question:
does Citadel keep read/unread status separately for each user in that case?
Judging by what I saw on the citadel.org BBS, it does.
- Make the shell script require `sh`, not `bash`, and avoid Bash-specific
features (see the examples after this list).
- Choose reasonable defaults, and add support for LDAP and SQL as sources of
user information
- Add all the standard files required by Debian packages
- Add support for variable aggregation frequency
- See if it makes sense to rewrite in C/C++ once the server has hundreds of
users, i.e. measure how long the script takes to execute at that scale
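For the config-merging item above, a minimal sketch of filtering the user's
config before appending it to the system one. The paths and the list of
protected directives are illustrative assumptions, not the project's actual
layout:

```sh
#!/bin/sh
# Build one user's mpop config instead of blindly cat-ing the two files.
sys_conf=/etc/doar/mpoprc            # assumed system-wide defaults
user_conf="$1"                       # the user's own config fragment
merged="$2"                          # where the merged config is written

{
    cat "$sys_conf"
    # Drop user lines that try to override admin-controlled directives.
    # Which directives need protecting is an open question; these are examples.
    grep -v -E '^[[:space:]]*(delivery|tls_trust_file)[[:space:]]' "$user_conf"
} > "$merged"
```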
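For the error-handling item, a sketch of a per-user loop where one failing
mpop call does not abort the whole run and the error output is mailed to the
affected user. The home-directory layout, the config location and the use of
mail(1) are assumptions:

```sh
#!/bin/sh
# Aggregate mail for every user; keep going when one account fails.
for home in /home/*; do
    user=$(basename "$home")
    conf="$home/.doar/mpoprc"        # assumed location of the user's config
    [ -f "$conf" ] || continue

    errfile=$(mktemp)
    if ! mpop --file="$conf" --all-accounts 2> "$errfile"; then
        # Tell the user what went wrong instead of stopping the run.
        mail -s "doar: fetching your mail failed" "$user" < "$errfile"
    fi
    rm -f "$errfile"
done
```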
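For the sh-vs-bash item, a few common Bash-only constructs and their POSIX sh
replacements (file names are made up for the example):

```sh
# Bash:  if [[ $user == admin* ]]; then ...; fi
# POSIX: case "$user" in admin*) ... ;; esac

# Bash:  source /etc/doar/common.sh
# POSIX: . /etc/doar/common.sh

# Bash:  configs=( /home/*/.doar/mpoprc ); echo "${configs[0]}"
# POSIX: no arrays; loop directly: for conf in /home/*/.doar/mpoprc; do ...; done
```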
### Issue: aggregation frequency
How often should aggregation happen? Should the user be able to control it?
What do I do if users have different frequencies? Is one cron job still
enough, or do I install a new cron job per user instead? Ideas:
- One cron job runs every 10 minutes, and rates must be multiples of 10m
- One cron job, same rate for all users
- Cron job per user in user's crontab
- Root has cron jobs for the users, to prevent users from changing them via SSH
For now, this script uses the same rate for all users and runs as a single
system-wide job, as in the sketch below.
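A minimal sketch of such a job, assuming an /etc/cron.d entry, a script
installed as /usr/local/bin/doar-aggregate (a made-up name), and the dedicated
'doar' user suggested in the TODO list instead of root:

```
# /etc/cron.d/doar - run aggregation every 10 minutes as the 'doar' user
*/10 * * * * doar /usr/local/bin/doar-aggregate
```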