We have a job running on a Spark cluster that needs to write to HBase. Since an HBase connection cannot be shared between nodes, I create a connection on each worker for it to use. Now I have a problem deciding when to close the HBase connection. According to HBase best practice, I should close it myself if I know when the worker finishes. However, if the worker is killed by the driver or terminated by an error, will the HBase connection be closed automatically? If not, where should I handle the HBase connection in that case? I know a connection leak is bad, but I don't know how to avoid it.
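For context, the pattern I'm asking about looks roughly like this: the connection is created per task (e.g. inside `foreachPartition`) and closed in a `finally` block so it is released even when the task throws. This is a simplified, hedged sketch: `MockConnection` is a hypothetical stand-in for the real `org.apache.hadoop.hbase.client.Connection` (which would come from `ConnectionFactory.createConnection(conf)`), so the snippet runs without a cluster.

```java
import java.io.Closeable;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;

public class ConnectionPerPartition {

    // Hypothetical stand-in for the real HBase Connection.
    static class MockConnection implements Closeable {
        final AtomicBoolean closed = new AtomicBoolean(false);

        void put(String row) {
            // Real code would build an HBase Put and write via a Table.
            System.out.println("put " + row);
        }

        @Override
        public void close() {
            closed.set(true);
        }
    }

    // Simulates the body passed to rdd.foreachPartition(iterator -> { ... }):
    // one connection per task, closed in finally.
    static MockConnection writePartition(List<String> rows) {
        MockConnection conn = new MockConnection();
        try {
            for (String row : rows) {
                conn.put(row);
            }
        } finally {
            // Runs on normal completion AND when the task body throws,
            // so the connection does not leak on ordinary task failure.
            conn.close();
        }
        return conn;
    }

    public static void main(String[] args) {
        MockConnection conn = writePartition(Arrays.asList("row1", "row2"));
        System.out.println("closed=" + conn.closed.get());
    }
}
```

The `finally` block covers task-level exceptions; my question is specifically about the remaining case, where the executor JVM is killed outright and no user code gets a chance to run.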
Source: Stack Overflow