How to handle HBase connection closure in Spark when a worker shuts down (Scala)

By Weifeng Zhang

We have a job running on a Spark cluster that needs to write to HBase. Since an HBase connection cannot be shared between nodes, I create a connection on each worker. My problem is knowing when to close these connections. According to HBase best practice, I should close a connection myself once I know the worker is finished with it. However, if the worker is killed by the driver or terminated by an error, will the HBase connection be closed automatically? If not, where should I close the connection in that case? I know that leaking connections is bad, but I don't know how to avoid it.
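For reference, the usual way to bound the connection's lifetime is to open it inside `foreachPartition` and close it in a `finally` block, so it is released even when a write throws. The sketch below shows that pattern; `FakeHBaseConnection` is a hypothetical stand-in for HBase's `Connection` (normally obtained via `ConnectionFactory.createConnection`), used here only so the example runs without Spark or HBase on the classpath.

```scala
// Hypothetical stand-in for org.apache.hadoop.hbase.client.Connection,
// so this sketch runs without HBase/Spark dependencies.
class FakeHBaseConnection {
  var closed = false
  def close(): Unit = { closed = true }
  def put(row: String): Unit =
    if (closed) throw new IllegalStateException("connection already closed")
}

// Per-partition pattern: open one connection per partition (i.e. per task),
// write the rows, and close in `finally` so the connection is released
// even if a write throws. In a real job this body would be the closure
// passed to rdd.foreachPartition { rows => ... }.
def writePartition(rows: Iterator[String], conn: FakeHBaseConnection): Unit = {
  try {
    rows.foreach(conn.put)
  } finally {
    conn.close()
  }
}

val conn = new FakeHBaseConnection
writePartition(Iterator("row1", "row2"), conn)
// conn.closed is now true, whether or not the writes succeeded
```

For a connection kept alive for the whole executor lifetime instead, a JVM shutdown hook (`sys.addShutdownHook(conn.close())`) can close it on normal JVM exit, but note that no hook runs if the process is killed with SIGKILL.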
Thanks.

Source: Stack Overflow

