Caught in an iconoclastic mood, I would like to challenge some conventional thinking: that distributed system architecture should favor lots of little independent services, each doing one thing, and doing one thing well.
The conservation-of-complexity principle (in the spirit of Brooks's essential complexity in "No Silver Bullet") suggests that whenever we think we are simplifying something, we are really just pushing the complexity around to somewhere it is less visible and less easily managed.
I think this is sometimes the case here: if you have really good operations and sysadmin resources but limited developer resources, then the lots-of-little-processes architecture is probably appropriate.
If, on the other hand, you have good developer resources and limited operations and sysadmin resources, then all you are doing is shifting the complexity to someplace where you lack the tools to manage and control it.
In summary, the quality of an architecture depends more on how well it complements and supports the capabilities of the team and its toolset than it does on any fundamental natural law.
Software engineering is a social science, not a hard engineering discipline, nor a mathematical or physical science.