
remove application from pool on timeout #175

Closed

Conversation

@nonrational (Member) commented Sep 15, 2018

This PR addresses #59.

On a successful SIGTERM, Kill() doesn't remove the application from the pool. As a result, on the next request puma-dev thinks the app is still running and doesn't boot it. With this patch, the app is removed from the pool on timeout, ensuring that puma-dev boots a new puma process on the next request.
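
In rough Go terms, the change looks like this. This is a minimal sketch, not the actual puma-dev source: the pool's internal fields (apps, lock) and the terminate helper are assumed names for illustration.

// Sketch of the fix: after a successful terminate, the entry must also
// be deleted from the pool, or the next lookup finds a dead app and
// never boots a replacement.
package main

import (
	"fmt"
	"sync"
)

type App struct {
	Name string // the real struct also carries the process handle, etc.
}

type AppPool struct {
	lock sync.Mutex
	apps map[string]*App
}

// terminate stands in for the real SIGTERM-and-wait logic.
func terminate(app *App) error { return nil }

// Kill stops the app and, crucially, removes it from the pool so the
// next request boots a fresh puma process.
func (pool *AppPool) Kill(app *App) error {
	if err := terminate(app); err != nil {
		return err
	}
	pool.lock.Lock()
	defer pool.lock.Unlock()
	delete(pool.apps, app.Name)
	return nil
}

func main() {
	app := &App{Name: "myapp"}
	pool := &AppPool{apps: map[string]*App{app.Name: app}}
	_ = pool.Kill(app)
	_, stillThere := pool.apps[app.Name]
	fmt.Println("still in pool after Kill:", stillThere) // false
}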

@gkatsanos

We faced the same issue.

@nateberkopec (Member)

Thanks for tracking this down! Please add a test.

@nonrational (Member, Author) commented Aug 21, 2019

Fair enough, @nateberkopec.

Would you be open to merging #188? AFAIK it's necessary in order to build with go 1.10.4+. Without it, on go1.11.4 darwin/amd64, I get:

BUILD| github.com/puma/puma-dev/vendor/github.com/fsnotify/fsevents
     | vendor/github.com/fsnotify/fsevents/wrap.go:115:183: cannot use nil as type _Ctype_CFAllocatorRef in argument to func literal
     | vendor/github.com/fsnotify/fsevents/wrap.go:122:168: cannot use nil as type _Ctype_CFAllocatorRef in argument to func literal
     | vendor/github.com/fsnotify/fsevents/wrap.go:175:44: cannot use nil as type _Ctype_CFAllocatorRef in argument to _Cfunc_CFStringCreateWithCString
FAIL | 	github.com/puma/puma-dev/dev [build failed]

@nateberkopec (Member)

Sure, I'll take a look and try to repro. If I can get the tests running, it should be a simple merge.

@nateberkopec (Member)

@nonrational Fixed the dep, Travis should be running soon too. Back to you 🙇

@nonrational (Member, Author) commented Aug 21, 2019

Before going wild trying to mock AppPool and some of the other low-level structs necessary to test this approach, I tried to reproduce this issue on the latest master and have been unable to do so.
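
For context, the test in question would be shaped roughly like this. Illustrative only: App and AppPool here are simplified stand-ins mirroring the sketch in the PR description, not the real structs in dev/app.go.

package dev

import (
	"sync"
	"testing"
)

// Simplified stand-ins for the real App and AppPool.
type App struct{ Name string }

type AppPool struct {
	lock sync.Mutex
	apps map[string]*App
}

func (pool *AppPool) Kill(app *App) error {
	pool.lock.Lock()
	defer pool.lock.Unlock()
	delete(pool.apps, app.Name)
	return nil
}

// TestKillRemovesAppFromPool checks the behavior this PR adds: after a
// successful Kill, the app must no longer be present in the pool.
func TestKillRemovesAppFromPool(t *testing.T) {
	app := &App{Name: "myapp"}
	pool := &AppPool{apps: map[string]*App{app.Name: app}}

	if err := pool.Kill(app); err != nil {
		t.Fatal(err)
	}
	if _, ok := pool.apps[app.Name]; ok {
		t.Fatalf("expected %q to be removed from the pool after Kill", app.Name)
	}
}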

I have two Rails applications, myapp and hello-world, using different Rails and Puma versions.

# $ sw_vers; ruby -v; puma --version; rails --version
# ProductName:    Mac OS X
# ProductVersion: 10.14.6
# BuildVersion:   18G84
# ruby 2.6.3p62 (2019-04-16 revision 67580) [x86_64-darwin18]
# puma version 4.1.0
# Rails 5.2.3

# $ sw_vers; ruby -v; puma --version; rails --version
# ProductName:    Mac OS X
# ProductVersion: 10.14.6
# BuildVersion:   18G84
# ruby 2.6.3p62 (2019-04-16 revision 67580) [x86_64-darwin18]
# puma version 3.12.1
# Rails 6.0.0

I then run the following on go version go1.11.4 darwin/amd64:

#!/usr/bin/env bash
# reproduce.sh

# clean up and build a brand new puma-dev binary
rm -f ./puma-dev
make

# remove old logs
mv -v ~/Library/Logs/puma-dev.log ~/Library/Logs/puma-dev.log.$(date '+%Y%m%d%H%M%S')

# ensure we don't have a lingering installation against an old binary
./puma-dev -uninstall -d test:localhost

# install with a 5 second timeout to make testing quicker
./puma-dev -d test:localhost -timeout 5s -install
sudo ./puma-dev -d test:localhost -timeout 5s -setup

# wait for daemon to fully init
echo "Sleeping for 5s..." && sleep 5

# curl two different apps
curl http://hello-world.localhost/ > /dev/null
curl http://myapp.localhost/ > /dev/null

# wait for the timeout and a little
echo "Sleeping for 10s..." && sleep 10

# check that both apps received boot and cleanup
grep 'booted\|cleaned' ~/Library/Logs/puma-dev.log

and here's the output:

$ ./reproduce.sh

go build ./cmd/puma-dev
/Users/user/Library/Logs/puma-dev.log -> /Users/user/Library/Logs/puma-dev.log.20190821101255
* Removed puma-dev from automatically running
* Removed domain 'test'
* Removed domain 'localhost'
* Use '/Users/user/go/src/github.com/puma/puma-dev/puma-dev' as the location of puma-dev
* Installed puma-dev on ports: http 80, https 443
* Configuring /etc/resolver to be owned by user
* Changing '/etc/resolver/localhost' to be owned by user
* Changing '/etc/resolver/test' to be owned by user
Sleeping for 5s...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   615  100   615    0     0    125      0  0:00:04  0:00:04 --:--:--   166
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  3526    0  3526    0     0    802      0 --:--:--  0:00:04 --:--:--   803
Sleeping for 10s...
! App 'hello-world' booted
! App 'myapp' booted
* App 'hello-world' shutdown and cleaned up
* App 'myapp' shutdown and cleaned up

The fact that both applications log "... and cleaned up" means we reached https://github.com/nonrational/puma-dev/blob/master/dev/app.go#L136-L146, which indicates that each application was correctly shut down and cleaned up after the timeout.
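
For anyone following along, the timeout path that produces those log lines works roughly like the sketch below. The names (watchIdle, lastHit) are assumptions for illustration; the real logic lives in dev/app.go.

package main

import (
	"fmt"
	"time"
)

type App struct {
	Name    string
	lastHit time.Time // updated on every proxied request in the real code
}

// watchIdle polls the app's last-request time and invokes kill once the
// idle timeout elapses; kill is expected to shut the app down, clean it
// up, and remove it from the pool.
func watchIdle(app *App, timeout time.Duration, kill func(*App)) {
	ticker := time.NewTicker(100 * time.Millisecond)
	defer ticker.Stop()
	for range ticker.C {
		if time.Since(app.lastHit) > timeout {
			kill(app)
			return
		}
	}
}

func main() {
	app := &App{Name: "hello-world", lastHit: time.Now()}
	done := make(chan struct{})
	go watchIdle(app, time.Second, func(a *App) {
		fmt.Printf("* App '%s' shutdown and cleaned up\n", a.Name)
		close(done)
	})
	<-done
}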

I invite others experiencing this issue to try to repro on master. I'm going to try to repro with an older ruby and puma. If I'm unable to reproduce, I'll close this PR.

@nonrational (Member, Author)

Failed to reproduce with:

$ ~/go/src/github.com/puma/puma-dev/puma-dev -V; sw_vers; ruby -v; puma --version; rails --version
Version: devel (go1.11.4)
ProductName:	Mac OS X
ProductVersion:	10.14.6
BuildVersion:	18G84
ruby 2.5.5p157 (2019-03-15 revision 67260) [x86_64-darwin18]
puma version 3.12.0
Rails 6.0.0

@nonrational (Member, Author)

Failed to reproduce with:

$ ~/go/src/github.com/puma/puma-dev/puma-dev -V; sw_vers; ruby -v; puma --version; rails --version
Version: devel (go1.11.4)
ProductName:	Mac OS X
ProductVersion:	10.14.6
BuildVersion:	18G84
ruby 2.4.1p111 (2017-03-22 revision 58053) [x86_64-darwin18]
puma version 3.12.1
Rails 5.2.3

Closing. 🙌
