Many people have suggested ways to improve the performance of MoinMoin. One thing discussed was moving away from CGI, because CGI forces everything - from the Python interpreter through the modules to the page data - to be loaded for every request. But persistence and a real WikiServer would have to deal with concurrency and the many problems resulting from it.
- Which problems?
- You have to protect every piece of data shared between the threads with semaphores or something similar. This can get really complicated, because it is not enough to protect each attribute individually: you have to hold the lock across every read-modify-write sequence.
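For example, incrementing a shared counter is only safe when the lock is held across the whole read-modify-write, not just around the read or the write. A minimal sketch (the `PageCache` class and its names are hypothetical, not MoinMoin code):

```python
import threading

class PageCache:
    """Hypothetical shared object; illustrates the locking discipline."""
    def __init__(self):
        self._lock = threading.Lock()
        self._hits = 0

    def record_hit(self):
        # The lock must span the whole read-modify-write: two threads
        # that both read 5 and both write back 6 would lose an update.
        with self._lock:
            self._hits += 1

    def hits(self):
        with self._lock:
            return self._hits

cache = PageCache()
threads = [threading.Thread(target=lambda: [cache.record_hit() for _ in range(1000)])
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(cache.hits())  # 4000 -- correct only because the lock spans the update
```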
I want to suggest a middle way: use CGI for the concurrency and one process for page rendering.
The new CGI would only put the request into a queue and wait for a response from the WikiServer. The WikiServer reads one request after the other and sends the page back to the CGI. Because the WikiServer is single-threaded, it can be programmed very easily and would consist only of the normal MoinMoin code plus a replacement for maincgi.py. Caching page or user objects would be easy. If every object checked whether its files had changed, it would even be possible to run several WikiServers at once to use more than one processor.
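A minimal sketch of such a single-threaded server loop, here assuming a Unix socket as the rendezvous point (the socket path and the `render()` stand-in are illustrative, not actual MoinMoin code):

```python
import socket, os

SOCK_PATH = "/tmp/wikiserver.sock"  # assumed rendezvous point

def render(request: bytes) -> bytes:
    # Stand-in for the real MoinMoin rendering (the maincgi.py replacement).
    return b"<html>rendered: " + request + b"</html>"

def serve_forever():
    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    server.bind(SOCK_PATH)
    server.listen(8)                   # waiting CGIs queue up here
    while True:
        conn, _ = server.accept()      # take the next queued CGI request
        request = conn.recv(65536)     # single read; real code would frame messages
        conn.sendall(render(request))  # page goes straight back to the CGI
        conn.close()                   # no threads, so no shared-state locking
```

Because only one request is in flight inside the server, all the locking problems from the persistence discussion above simply disappear.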
- What would be better about multiple "WikiServers" compared to a MoinMoin with persistence (see above)?
- I think it is much easier to implement. Of course, a Wiki with real persistence and real threading would be better.
Of course this will need some minor changes to the MoinMoin code base, but it should be possible to run MoinMoin in server mode and in CGI mode while sharing 95% of the code. In particular, the edit operations will continue to work as they do now.
Communication between the CGI and the WikiServer could be implemented using:
- Files
- Pipes
- Sockets
Example:
- CGI writes the request into a file in the request directory
- CGI opens a named pipe (its name is given in the request) and waits
- WikiServer removes the request file
- WikiServer writes the page content into the named pipe
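The steps above could be sketched like this; the request directory, the JSON request format, and the function names are assumptions for illustration, not existing MoinMoin code:

```python
import os, json, uuid, tempfile

REQUEST_DIR = "/tmp/wiki-requests"   # assumed layout

def cgi_side(query: str) -> bytes:
    """Runs inside the short-lived CGI process."""
    os.makedirs(REQUEST_DIR, exist_ok=True)
    pipe = os.path.join(tempfile.gettempdir(), "wiki-%s.fifo" % uuid.uuid4().hex)
    os.mkfifo(pipe)
    # 1. drop the request (including the pipe name) into the request directory
    req = os.path.join(REQUEST_DIR, uuid.uuid4().hex + ".req")
    with open(req, "w") as f:
        json.dump({"query": query, "pipe": pipe}, f)
    # 2. block until the WikiServer writes the page into our pipe
    with open(pipe, "rb") as f:
        page = f.read()
    os.unlink(pipe)
    return page

def server_step():
    """One iteration of the single-threaded WikiServer loop."""
    for name in sorted(os.listdir(REQUEST_DIR)):
        path = os.path.join(REQUEST_DIR, name)
        with open(path) as f:
            req = json.load(f)
        os.unlink(path)                                         # 3. remove the request file
        page = b"<html>" + req["query"].encode() + b"</html>"   # stand-in for rendering
        with open(req["pipe"], "wb") as f:                      # 4. answer through the pipe
            f.write(page)
```

In a real implementation the request file would be written atomically (write to a temporary name, then rename), so the server never sees a half-written request.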
Advantages:
- no time spent loading:
- the Python interpreter
- the modules
- the pages
- Page structure could be cached in memory (if it is cached on disk it must also be read and parsed...)
- good basis to develop a multithreaded server later on
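The in-memory caching of page structure could be invalidated with a simple mtime check, which is also what would let several WikiServers share the on-disk pages safely. A sketch with hypothetical names, not the real Page class:

```python
import os

class CachedPage:
    """Caches the parsed page structure; re-reads only when the file changed."""
    def __init__(self, path):
        self.path = path
        self._mtime = None
        self._parsed = None

    def structure(self):
        mtime = os.stat(self.path).st_mtime
        if mtime != self._mtime:        # file changed (maybe by another server)
            with open(self.path) as f:
                self._parsed = f.read().splitlines()  # stand-in for real parsing
            self._mtime = mtime
        return self._parsed
```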
Problems:
- needs more memory
- who talks about memory nowadays?
- much more - perhaps dozens or even hundreds of times more
- maybe higher latency
- deadlocks: if a WikiPage reads another page and waits for the result, everything halts
- MoinMoin code can no longer access the CGI parameters directly
- Sounds too simple. I must have overlooked something.
Any suggestions?