So I spent the whole day optimizing bird.makeup and got roughly a 5x improvement in performance. Things are starting to run better; it seems we are now iterating over all accounts within 4-5 hours. I have other ideas that will probably get another 3x, which I want to implement tonight, so performance should be right on target.
Until more people sign up, of course!
@iDeacon Bummer (it seems to be a bug, not a server load timeout issue...)
@vincent Any chance you have an idea for what's causing this? Sidekiq logs on my instance are showing "Mastodon::UnexpectedResponseError: https://bird.makeup/users/mkbhd/inbox returned code 500"
@vincent What are the system requirements, if any, for running your own instance?
My cloud server has 2 GB of RAM and it keeps running out of memory. I had to add a memory limit to the birdmakeup container in the docker-compose.yml file to keep the system stable (but now the container is killed/restarted every half hour or so).
When it works, performance is great though. Thanks so much for building this!
@smallsco Yeah, there is a memory leak somewhere... I'm hunting for it!
I just took a quick look at it (by the way: nice code). Three potential things I found (no guarantees):
1. The version of Newtonsoft in use has a potential memory leak: https://github.com/advisories/GHSA-5crp-9r3c-p9vr
2. MagicKey has a `_rsa` property which needs to be disposed (there might be more; see the sketch below).
3. Quite a few regexes that might never complete.
I would need to actually run the application for my tools to find something, but maybe I'm lucky with the above.
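On point 2, a minimal sketch of what deterministic disposal could look like, assuming `_rsa` is a System.Security.Cryptography.RSA instance; the constructor and Sign method here are placeholders, not the real MagicKey API:
-----
// Illustrative sketch only: if MagicKey holds an RSA instance in a private
// _rsa field, implementing IDisposable lets callers release the key handle
// deterministically instead of waiting for the finalizer.
using System;
using System.Security.Cryptography;

public sealed class MagicKey : IDisposable
{
    private readonly RSA _rsa = RSA.Create();

    // Placeholder signing method, not the actual MagicKey surface.
    public byte[] Sign(byte[] data) =>
        _rsa.SignData(data, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);

    public void Dispose() => _rsa.Dispose();
}

// Call sites can then scope the key:
// using var key = new MagicKey();
-----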
Thank you so much for taking a look; hopefully one of these things holds the solution to the memory leak.
@DevWouter @smallsco Thanks for the compliment! But the credit goes to @BirdsiteLIVE .
I think I figured it out. I took a memory dump and noticed large allocations related to SQL queries. When I removed Dapper and used vanilla Npgsql, the problem went away. It's really not obvious to me what was happening there, but it might be related to your number 1.
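For context, a hedged sketch of the kind of change this implies; the table, column, and method names below are hypothetical examples, not the actual bird.makeup schema:
-----
// Illustrative only: replaces a Dapper call with a plain NpgsqlCommand.
// The query, table, and column are hypothetical.
using System;
using System.Threading.Tasks;
using Npgsql;

public static async Task<int?> GetFollowersCountAsync(string connString, string acct)
{
    await using var conn = new NpgsqlConnection(connString);
    await conn.OpenAsync();

    // Before (Dapper):
    // return await conn.QueryFirstOrDefaultAsync<int?>(
    //     "SELECT followersCount FROM twitter_users WHERE acct = @acct", new { acct });

    // After (vanilla Npgsql):
    await using var cmd = new NpgsqlCommand(
        "SELECT followersCount FROM twitter_users WHERE acct = @acct", conn);
    cmd.Parameters.AddWithValue("acct", acct);

    var result = await cmd.ExecuteScalarAsync();
    return result is null or DBNull ? null : (int?)Convert.ToInt32(result);
}
-----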
@vincent Thank you! I will update my instance and see if the problem goes away as well.
@smallsco No problem! I just did the most frequent queries, so it's still leaking, just way more slowly
@vincent With the latest version, CPU and network usage have gone up slightly, but I'm not seeing any change whatsoever in memory usage.
I updated the docker-compose.yml file to enforce a 1.5 GB memory limit (my server has 2 GB), which is what causes the consistent drops you see in the attached graphs:
-----
deploy:
  resources:
    limits:
      memory: 1500M
-----
I'm surprised; I've never noticed memory leaks in the past when running the official instance (and I've been running it on a 500 MB VPS all this time). Could it be linked to some of your fork modifications?
Keep me updated on your findings!
@BirdsiteLIVE @vincent @DevWouter
FWIW, as a user I opted to use the Birdmakeup fork because I didn't want to create a Twitter Developer account and use their API... I'm trying to get away from that site, not tie myself more to it.
I would consider giving the original Birdsitelive a try though if I can get away with running it on a smaller/cheaper instance.
@smallsco @BirdsiteLIVE @DevWouter I think I figured it out; try doing a docker pull to see how it behaves.
@vincent @BirdsiteLIVE @DevWouter Thanks, just pulled. I'll let it run for a while and get back to you.
@vincent @BirdsiteLIVE @DevWouter Happy to report that the memory leak is fixed!!
My host is having some problems right now, so I can't attach a fancy graph, but CPU usage has dropped from ~25% to ~4%, and memory utilization has remained flat at 28%. Thanks so much for tracking this down!
@BirdsiteLIVE @DevWouter @smallsco Okay, so the Npgsql thing was not it at all. It's actually that TransformBlock will buffer data indefinitely if you don't bound it.
See the commit that fixed it for me: https://git.sr.ht/~cloutier/bird.makeup/commit/5c75e79abc3cf377da4dbbe369a156ae9c3b890a
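For reference, a minimal sketch of bounding a TransformBlock with ExecutionDataflowBlockOptions; the block body and the FetchLatestTweetsAsync helper are hypothetical, not the code from that commit:
-----
// Minimal sketch: an unbounded TransformBlock queues every posted item in
// memory; setting BoundedCapacity applies backpressure so producers wait
// instead of growing the queue without limit.
using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

var options = new ExecutionDataflowBlockOptions
{
    BoundedCapacity = 100,       // cap on the block's internal input queue
    MaxDegreeOfParallelism = 4   // illustrative value
};

var fetchTweets = new TransformBlock<string, string[]>(
    async username => await FetchLatestTweetsAsync(username), // hypothetical helper
    options);

// Producers should use SendAsync, which awaits while the block is full,
// rather than Post, which returns false when the item cannot be accepted.
await fetchTweets.SendAsync("someusername");

// Hypothetical helper used only to make the sketch self-contained.
static Task<string[]> FetchLatestTweetsAsync(string username) =>
    Task.FromResult(Array.Empty<string>());
-----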
@vincent
Thank you for doing this.