
Caused by: java.io.IOException: Unable to connect to provided ports 10000~10010 #346

Open · wants to merge 76 commits into base: revert-237-LIVY-255

Conversation

1074559124 commented:

19/07/30 14:42:28 ERROR SessionServlet$: internal error
java.lang.RuntimeException: java.io.IOException: Unable to connect to provided ports 10000~10010
at org.apache.livy.rsc.Utils.propagate(Utils.java:60)
at org.apache.livy.rsc.RSCClientFactory.createClient(RSCClientFactory.java:75)
at org.apache.livy.LivyClientBuilder.build(LivyClientBuilder.java:124)
at org.apache.livy.server.interactive.InteractiveSession$$anonfun$3.apply(InteractiveSession.scala:107)
at org.apache.livy.server.interactive.InteractiveSession$$anonfun$3.apply(InteractiveSession.scala:77)
at scala.Option.orElse(Option.scala:257)
at org.apache.livy.server.interactive.InteractiveSession$.create(InteractiveSession.scala:77)
at org.apache.livy.server.interactive.InteractiveSessionServlet.createSession(InteractiveSessionServlet.scala:56)
at org.apache.livy.server.interactive.InteractiveSessionServlet.createSession(InteractiveSessionServlet.scala:40)
at org.apache.livy.server.SessionServlet$$anonfun$16.apply(SessionServlet.scala:121)
at org.apache.livy.server.SessionServlet$$anonfun$16.apply(SessionServlet.scala:120)
at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$$liftAction(ScalatraBase.scala:270)
at org.scalatra.ScalatraBase$$anonfun$invoke$1.apply(ScalatraBase.scala:265)
at org.scalatra.ScalatraBase$$anonfun$invoke$1.apply(ScalatraBase.scala:265)
at org.scalatra.ApiFormats$class.withRouteMultiParams(ApiFormats.scala:178)
at org.apache.livy.server.JsonServlet.withRouteMultiParams(JsonServlet.scala:39)
at org.scalatra.ScalatraBase$class.invoke(ScalatraBase.scala:264)
at org.scalatra.ScalatraServlet.invoke(ScalatraServlet.scala:49)
at org.scalatra.ScalatraBase$$anonfun$runRoutes$1$$anonfun$apply$8.apply(ScalatraBase.scala:240)
at org.scalatra.ScalatraBase$$anonfun$runRoutes$1$$anonfun$apply$8.apply(ScalatraBase.scala:238)
at scala.Option.flatMap(Option.scala:170)
at org.scalatra.ScalatraBase$$anonfun$runRoutes$1.apply(ScalatraBase.scala:238)
at org.scalatra.ScalatraBase$$anonfun$runRoutes$1.apply(ScalatraBase.scala:237)
at scala.collection.immutable.Stream.flatMap(Stream.scala:446)
at org.scalatra.ScalatraBase$class.runRoutes(ScalatraBase.scala:237)
at org.scalatra.ScalatraServlet.runRoutes(ScalatraServlet.scala:49)
at org.scalatra.ScalatraBase$class.runActions$1(ScalatraBase.scala:163)
at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply$mcV$sp(ScalatraBase.scala:175)
at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
at org.scalatra.ScalatraBase$$anonfun$executeRoutes$1.apply(ScalatraBase.scala:175)
at org.scalatra.ScalatraBase$class.org$scalatra$ScalatraBase$$cradleHalt(ScalatraBase.scala:193)
at org.scalatra.ScalatraBase$class.executeRoutes(ScalatraBase.scala:175)
at org.scalatra.ScalatraServlet.executeRoutes(ScalatraServlet.scala:49)
at org.scalatra.ScalatraBase$$anonfun$handle$1.apply$mcV$sp(ScalatraBase.scala:113)
at org.scalatra.ScalatraBase$$anonfun$handle$1.apply(ScalatraBase.scala:113)
at org.scalatra.ScalatraBase$$anonfun$handle$1.apply(ScalatraBase.scala:113)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.scalatra.DynamicScope$class.withResponse(DynamicScope.scala:80)
at org.scalatra.ScalatraServlet.withResponse(ScalatraServlet.scala:49)
at org.scalatra.DynamicScope$$anonfun$withRequestResponse$1.apply(DynamicScope.scala:60)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.scalatra.DynamicScope$class.withRequest(DynamicScope.scala:71)
at org.scalatra.ScalatraServlet.withRequest(ScalatraServlet.scala:49)
at org.scalatra.DynamicScope$class.withRequestResponse(DynamicScope.scala:59)
at org.scalatra.ScalatraServlet.withRequestResponse(ScalatraServlet.scala:49)
at org.scalatra.ScalatraBase$class.handle(ScalatraBase.scala:111)
at org.scalatra.ScalatraServlet.org$scalatra$servlet$ServletBase$$super$handle(ScalatraServlet.scala:49)
at org.scalatra.servlet.ServletBase$class.handle(ServletBase.scala:43)
at org.apache.livy.server.SessionServlet.org$scalatra$MethodOverride$$super$handle(SessionServlet.scala:39)
at org.scalatra.MethodOverride$class.handle(MethodOverride.scala:28)
at org.apache.livy.server.SessionServlet.org$scalatra$GZipSupport$$super$handle(SessionServlet.scala:39)
at org.scalatra.GZipSupport$$anonfun$handle$1.apply$mcV$sp(GZipSupport.scala:36)
at org.scalatra.GZipSupport$$anonfun$handle$1.apply(GZipSupport.scala:19)
at org.scalatra.GZipSupport$$anonfun$handle$1.apply(GZipSupport.scala:19)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.scalatra.DynamicScope$class.withResponse(DynamicScope.scala:80)
at org.scalatra.ScalatraServlet.withResponse(ScalatraServlet.scala:49)
at org.scalatra.DynamicScope$$anonfun$withRequestResponse$1.apply(DynamicScope.scala:60)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.scalatra.DynamicScope$class.withRequest(DynamicScope.scala:71)
at org.scalatra.ScalatraServlet.withRequest(ScalatraServlet.scala:49)
at org.scalatra.DynamicScope$class.withRequestResponse(DynamicScope.scala:59)
at org.scalatra.ScalatraServlet.withRequestResponse(ScalatraServlet.scala:49)
at org.scalatra.GZipSupport$class.handle(GZipSupport.scala:18)
at org.apache.livy.server.interactive.InteractiveSessionServlet.org$scalatra$servlet$FileUploadSupport$$super$handle(InteractiveSessionServlet.scala:40)
at org.scalatra.servlet.FileUploadSupport$class.handle(FileUploadSupport.scala:93)
at org.apache.livy.server.interactive.InteractiveSessionServlet.handle(InteractiveSessionServlet.scala:40)
at org.scalatra.ScalatraServlet.service(ScalatraServlet.scala:54)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Unable to connect to provided ports 10000~10010
at org.apache.livy.rsc.rpc.RpcServer.<init>(RpcServer.java:101)
at org.apache.livy.rsc.RSCClientFactory.ref(RSCClientFactory.java:92)
at org.apache.livy.rsc.RSCClientFactory.createClient(RSCClientFactory.java:65)
... 82 more

40 requests were submitted to create Livy sessions at the same time and this error was thrown. Is concurrency limited by the port range 10000-10010, i.e. at most about 10 concurrent sessions? Can this be changed, and can the concurrency be increased?
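The commits below (PR #334) make this port range configurable. A minimal sketch of widening it in livy-client.conf, assuming the launcher.port.range key that this PR adds to livy-client.conf.template (exact key name and format are per that template):

livy.rsc.launcher.port.range = 10000~10110

With roughly 110 ports instead of the default 10000~10010, many more RSC drivers can bind concurrently.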

jerryshao and others added 30 commits November 30, 2016 17:46
* Update the docs related Livy build and Spark 2 support

Change-Id: Id59c628e73da46a124f718b192fc92336ac8eace

* Updating the doc

Change-Id: Ibdeb06fde53d359e77aace780a6093b2cc6a70b7
%table returns a wrong header if the input is a list of rows. Work around the issue by converting each row to a dict.
- SparkYarnAppSpec.can kill spark-submit while it's running.
Added sparkSession to expose Spark 2.0 SparkSession in JobContext. If SparkSession is not supported, it will throw an exception.

Example:
JobHandle<String> handler = client.submit(new Job<String>() {
  @Override
  public String call(JobContext jc) throws Exception {
    SparkSession session = jc.sparkSession();
    return session.version();
  }
});
- When the RSC connection times out, it should kill the application through the cluster manager to make sure nothing is leaked.
…domly. (#251)

- Redirect stderr to stdout to avoid synchronization between stdout and stderr.
…ter (#235)

Users can list both livy-repl_2.10 and livy-repl_2.11 jars and their dependencies in livy.repl.jars. Livy automatically picks the right jars according to Spark's Scala version and Scala jar versioning convention.
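An illustrative livy.conf entry listing both Scala builds (paths and jar file names are hypothetical):

livy.repl.jars = /opt/livy/repl-jars/livy-repl_2.10.jar,/opt/livy/repl-jars/livy-repl_2.11.jar

Livy then matches the _2.10 or _2.11 suffix against the Scala version of the configured Spark.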
…ding (#252)

* LIVY-271. The traceback could be a JSON list or a JSON object, which is misleading.
- InteractiveSessionSpec.should error out the session if the interpreter dies.
…me specific error. (#243)

- Instead of using regex, parse for internal frames manually.
- Repl in Scala 2.11 cleans internal frames internally. Skip internal frames cleaning.
InteractiveSessionSpec.should report an error if accessing an unknown variable
Change-Id: I1ff274c189212ad7c92dbaa597af68e947364b22
…e. (#247)

Notebook applications like Jupyter might crash while Livy is running. Nothing cleans up the corresponding Livy session, so it is leaked.
Heartbeat is added to address this. To keep a session alive, the notebook application must continuously make GET requests to the interactive session. If no GET request is made within the heartbeat interval, livy-server will delete the session regardless of its state (busy, idle).

Heartbeat is per session and is controlled by the session property "heartbeatTimeoutInSecond". Its default is 0, which means heartbeat is disabled. To enable heartbeat, create the session with a non-zero "heartbeatTimeoutInSecond" in the create request.
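For example, a create-session request enabling a 60-second heartbeat could look like this (the body fields other than "heartbeatTimeoutInSecond" are the usual create-session properties):

POST /sessions
{
  "kind": "spark",
  "heartbeatTimeoutInSecond": 60
}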
* Add livy REST API to get Livy build info

Change-Id: I56c5116f6f74357d06ceabcad38ac5988003ccf5

* Change some typos

Change-Id: If47570766e19b98b88de521aa57dfae1ae5b48d9

* Address the comments

Change-Id: I961979534281e7bb5030a8c471e5245c7dedca82
* Make Livy server request log configurable

Change-Id: If880908aa1afb02522a9278ddda185715a1f98f0

* Address the comments

Change-Id: I04bb29bf873d7f6341afb42ad4aad34cab35900c
…started and expose the state of RSCClient (#268)

Change-Id: Iae8b04393c4c52d87e87bd451916d6191007db08
Currently livy.rsc.jars is an RSC configuration, which means users have to specify it every time they create a session. This is semantically incorrect and inconvenient; to be consistent with livy.repl.jars, it should become a Livy configuration.

Besides, the current doc uses livy.jars as the configuration name, but no code honors that name; this should also be updated.
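An illustrative livy.conf entry under the new scheme (the path is hypothetical, and the key name assumes the configuration keeps the livy.rsc.jars name after the move):

livy.rsc.jars = file:///opt/livy/rsc-jars/livy-rsc.jar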
…park. (#270)

HiveContext is always created regardless of whether we enable it through spark.repl.enableHiveContext. The root cause is that we depend on Spark's shell.py, and unfortunately HiveContext does not initialize itself when created but defers initialization until a method is called. This change calls sqlContext.tables() to check whether HiveContext can work properly.
zjffdu and others added 30 commits February 17, 2017 15:50
Change-Id: Ibf549db71b0f6dfc2004c3ff602e92860a38a093
Currently a session timeout can't be configured below 1h. This is due to a hard-coded value that was missed in the clean up of LIVY-114 and LIVY-127.

Removed the limit and the conf value (default of 1h) will be used.
Specified fields for the client test. Travis CI appears to default to Python 2.7, so it never hit this issue.
The current session GC mechanism has some issues:
- A stopped session still has to wait for the timeout before being GC-ed.
- A batch session will be GC-ed unexpectedly at runtime when it times out, which makes long-running applications impossible.
- Sometimes users don't want idle sessions to be stopped.

Changes in this commit:
- Never check the activity of batch sessions, which means a batch session is only GC-ed after it is stopped.
- Add a configuration to turn off the activity check for interactive sessions, which fits some usage scenarios (see the conf sketch after this list).
- Add a configuration to control how long a finished session's state is kept in memory before being cleaned out.
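A sketch of the resulting livy.conf knobs, assuming the key names that appear in later livy.conf.template versions:

livy.server.session.timeout-check = true     # set to false to disable the activity check
livy.server.session.timeout = 1h
livy.server.session.state-retain.sec = 600s  # how long finished session state is kept in memory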
To avoid OOM for long-running sessions, introduce a statement retention mechanism that removes old statements.

Also refactor the statement state code to make it clearer.
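The retention cap is presumably exposed as an RSC setting; a hedged sketch, assuming the retained-statements key from RSCConf:

livy.rsc.retained-statements = 100  # old statements beyond this count are dropped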
…teractive sessions. (#297)

Passing driverProcess to SparkApp for interactive sessions so session/log will return spark-submit log.
- Added instructions to livy-client template.
- Filled out livy-client.conf.template.
- Updated conf naming to all camelCase, switched conf files to templates and updated .gitignore.
Updated pom files that used 4 space indent to use 2 space indent to match the remaining pom files.
* Change Hadoop dependencies to Apache Hadoop

Change-Id: I5500cc0061b03af1587700b7b0a7a147bbf0a333

* Change to use Apache Hadoop 2.7.3

Change-Id: Ied6a69e28fb4a9e4d1dcd97595c7ab253f1eaf82
- The Livy server URL is exposed as an HTTP URL even when HTTPS is enabled, so we should handle this.
- The Livy server SSL keystore password and key password are currently set from the same configuration; they should be separated.
Change-Id: I9daec6ac7fdb2c9f727cbab6db042ec144c4edb8
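A sketch of the separated SSL settings, assuming the key names used in livy.conf.template (the keystore path is illustrative):

livy.keystore = /path/to/keystore.jks
livy.keystore.password = <storePassword>
livy.key-password = <keyPassword>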
…urn 1 on error. (#311)

Currently if livy-server fails to start it still returns 0 (success); this has been fixed. I also added a livy-server status command that returns a status output similar to that of livy-server start when it's already running, or livy-server stop when it's already stopped. Lastly I updated all output text to use livy-server instead of livy_server, since that's the actual name of the script.

Tested manually.
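An illustrative session with the updated script (the subcommands are from this commit; the printed output is hypothetical):

$ ./bin/livy-server start
$ ./bin/livy-server status    # reports whether livy-server is running
$ ./bin/livy-server stop
$ echo $?                     # now non-zero when a command fails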
* LIVY-313. Fixed SparkRInterpreter always returning success.

- Stopped redirecting stderr to stdout.
- Continue to read ErrorStream (it was only being read once).
- Checking for any errors returned by stderr before returning success.

* Fixing scalastyle check error

* Changing the way errors are handled in SparkRInterpreter

* Fixing scalastyle check error

* Updating SparkRSessionSpec
…300)

Updates to Livy configurations

- Added config deprecation with alternatives to ClientConf, HTTPConf, RSCConf, and LivyConf.
- Added framework for deprecation without alternatives when the need arises.
- Updated naming conventions in code and templates to use - instead of _ or camelCase and deprecated previous configs.
- Updated TestClientConf and added a new test.
…#312)

Some Spark packages depend on scala-reflect 2.11.0, which conflicts with Spark's Scala version 2.11.8.
This isn't Livy's fault, but it doesn't hurt to make Livy more fault tolerant.
Since a scala-reflect jar with the correct version must already be on the CLASSPATH, livy-repl was fixed to not load user-supplied scala-reflect jars.
Change-Id: Ia8b635dbb9e8ef8e55bbe566967c5abaa5e07020
Change-Id: Id301749a5e678666df771aa8f02cce149b56ff93
…e issue (#323)

* Refactor statement progress tracker to fix binary compatible issue

Change-Id: Ie91fd77472aeebe138bd6711a0baa82269a6b247

* refactor again to simplify the code

Change-Id: I9380bcb8dd2b594250783633a3c68e290ac7ea28

* isolate statementId to job group logic

Change-Id: If554aee2c0b3d96b54804f94cbb8df9af7843ab4
…#319)

* squash of original ui poc code

* Initial All Sessions Page code

* finished all-sessions page and cleaned up code

* Moved metrics and added ui redirect, added no sessions message

* added conf to template, cleaned up missed code from last commit

* UI enable/disable is no longer configurable; it is always on

* nit

* Addressed review comments and fixed some HTML/CSS errors

* Fixed ITs and added a redirect to metrics when ui is disabled
* Add session information log when session is created

Change-Id: I8702ebe1d893cf328b4490e5b5d09f3afd02b7ce

* add more logs

Change-Id: I969edf4e680e59e3ecfbaa50431ae61bd03d4795

* simplify the code

Change-Id: If2c7876ddb6020ace3872c9ae639d69d58b02f48

* Address the comments

Change-Id: Iba5f10ad73ff67b97af8292769b254346b8ef7c8
…igurable (#329)

* Provide Jetty HTTP request/response header size configuration

Kerberos over HTTP requires larger header sizes than the default, otherwise HTTP Error 413 (Request entity too large) is returned. This patch increases Livy's default to 128K and also makes the sizes configurable.

* Change the conf key name

Change-Id: Id274c8cc60d30e5d778f9c447502b7e7a789a8f0
…ons (#331)

* Add unit test to verify large header size configuration

Change-Id: I6c231f9fc9773d1ea40313661b7c49ccfaa44796

* Style fix

Change-Id: I24e617f95fd3e45a674a6b5a691428f0fdabcd89

* Style fix

Change-Id: I91e71979499da8ebba41223b6fe41862de168d03

* Revert the changes

Change-Id: Id7bbd1b2378867c8534b900bc7ba9b1234a9b985

* Add configurations to livy.conf.template

Change-Id: I84d428869bc5cc22aa7f00c6c603ea4a6b052964
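The resulting entries in livy.conf.template, assuming the final key names (128K defaults):

livy.server.request-header.size = 131072
livy.server.response-header.size = 131072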
…ule six not found issue (#341)

Change-Id: I7ba64be56354be5a03e6f6b349990af6b662e457
…334)

* Code changes in RPCserver for user provided port

* Indentation Changes

* Indentation Changes

* Indentation Changes

* Indentation Changes

* Configuring Port Range

* Documentation Changed

* launcher.port.range will take care of launching RPC

* Checkstyle changes

* Checkstyle changes

* Dummy push

* Code changes

* Changed BindException Handling to SocketException Handling

* Changed Import Order

* Code changes to increase port range

* Set port isConnect to true

* Indentation Changes & port range in livy-client.conf.template

* Indentation changes

* Changed visibility of method to private

* Indentation Changes

* Indentation Changes

* Unit test case to test port range

* Checkstyle changes

* Unit test case for port range

* Added comment for Port Range Configuration and increase port range for unit test case