This repository has been archived by the owner on May 27, 2020. It is now read-only.

Connections not closing #158

Open
AndyShih12 opened this issue Sep 7, 2016 · 1 comment

Comments

@AndyShih12

Using 0.12.0 with Spark Streaming 2.0.

The number of connections keeps building up to a few thousand, until MongoDB refuses to accept any new connections.

Is there any way to reuse the same connection? Or to force-close the connection? Or to set a lifespan on the connections (e.g. 30 seconds)? Thanks
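
[Editor's note] A workaround some users fall back on while the connector leaks connections is to bypass it for writes and manage a plain MongoDB Java driver client per partition, closing the client when the partition finishes. This is only a sketch, not this connector's API; the host, database, and collection names are placeholders, and `maxIdleTimeMS` is the standard MongoDB connection-string option that caps how long an idle pooled connection may live.

```scala
import com.mongodb.{MongoClient, MongoClientURI}
import org.bson.Document

// Sketch of a per-partition writer: one client per partition, always closed.
def writePartition(docs: Iterator[Document]): Unit = {
  // maxIdleTimeMS=30000 closes pooled connections idle for more than 30 s.
  val uri = new MongoClientURI("mongodb://mongo-host:27017/?maxIdleTimeMS=30000") // placeholder host
  val client = new MongoClient(uri)
  try {
    val coll = client.getDatabase("exportdb").getCollection("events") // placeholder names
    docs.grouped(1000).foreach { batch =>
      coll.insertMany(java.util.Arrays.asList(batch: _*))
    }
  } finally {
    client.close() // force-close this partition's connection pool
  }
}

// In the streaming job:
// stream.foreachRDD { rdd => rdd.foreachPartition(writePartition) }
```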

@SMR2

SMR2 commented Nov 21, 2016

This problem was fixed in 0.11.2 (at least in my case).
I updated to Spark 2 and 0.12.0, and now the problem is back.

We export data from MongoDB every night, but the connections to MongoDB are never closed, so after a while our Spark job collapses because we end up with tons of connection threads.
Will you fix this in version 0.12?
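
[Editor's note] One way to confirm the leak from the server side is to read MongoDB's connection counters via the `serverStatus` command through the Java driver. A sketch only; the host name is a placeholder.

```scala
import com.mongodb.MongoClient
import org.bson.Document

// Placeholder host; point this at the MongoDB the export job writes to.
val client = new MongoClient("mongo-host", 27017)
val status = client.getDatabase("admin").runCommand(new Document("serverStatus", 1))
val connections = status.get("connections").asInstanceOf[Document]
// "current" = open connections, "available" = connections the server can still accept
println(s"current=${connections.getInteger("current")} available=${connections.getInteger("available")}")
client.close()
```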
