translate SQL-writing-language-integrated-relational-queries #36

Merged 1 commit on Mar 13, 2015
13 changes: 6 additions & 7 deletions in spark-sql/writing-language-integrated-relational-queries.md
# [Writing Language-Integrated Relational Queries](https://spark.apache.org/docs/latest/sql-programming-guide.html#writing-language-integrated-relational-queries)

**Language-integrated relational queries are experimental, and are currently only supported in Scala.**

Spark SQL also supports a domain-specific language (DSL) for writing queries, using the data from the first example:

```scala
// sc is an existing SparkContext.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
val people: RDD[Person] = ... // An RDD of case class objects, from the first example.

// The following is the same as 'SELECT name FROM people WHERE age >= 10 AND age <= 19'
val teenagers = people.where('age >= 10).where('age <= 19).select('name)
teenagers.map(t => "Name: " + t(0)).collect().foreach(println)
```
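
For context, `people` comes from the programming guide's first example, where a case class defines the schema and a plain text file is parsed into it. A minimal sketch of that setup, assuming the comma-separated `people.txt` file that ships with the Spark distribution:

```scala
// Define the schema using a case class (from the guide's first example).
case class Person(name: String, age: Int)

// Parse a comma-separated text file into an RDD of Person objects.
val people = sc.textFile("examples/src/main/resources/people.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))
```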

The DSL uses Scala symbols to represent columns in the underlying table; these are identifiers prefixed with a tick ('). Implicit conversions turn the symbols into expressions that are evaluated by the SQL execution engine. You can find more details in the [ScalaDoc](https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.SchemaRDD).
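
To illustrate the implicit conversion, the two chained `where` calls above can be collapsed into a single predicate, since the converted symbols support boolean operators. This is a sketch based on the Spark 1.x expression DSL; exact operator support may vary between versions:

```scala
// 'age is implicitly converted to an attribute expression, so comparison
// and boolean operators build a single expression tree that the SQL
// execution engine evaluates.
val teenagers2 = people.where('age >= 10 && 'age <= 19).select('name)

// Each result row is a Row; t(0) accesses the first (and only) column.
teenagers2.map(t => "Name: " + t(0)).collect().foreach(println)
```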