[Docs] Update package download links in docs (#515)
Signed-off-by: chenxu <[email protected]>
Co-authored-by: chenxu <[email protected]>
xuchen-plus and dmetasoul01 authored Jul 22, 2024
1 parent 84701f6 commit b481925
Showing 6 changed files with 14 additions and 9 deletions.
1 change: 1 addition & 0 deletions pom.xml
@@ -160,6 +160,7 @@ SPDX-License-Identifier: Apache-2.0
<serverId>ossrh</serverId>
<nexusUrl>https://s01.oss.sonatype.org/</nexusUrl>
<autoReleaseAfterClose>true</autoReleaseAfterClose>
<stagingProgressTimeoutMinutes>15</stagingProgressTimeoutMinutes>
</configuration>
</plugin>
<plugin>
@@ -46,7 +46,7 @@ The LakeSoul release jar can be downloaded from the GitHub Releases page: https://github.c
```bash
wget https://github.com/lakesoul-io/LakeSoul/releases/download/vVAR::VERSION/lakesoul-spark-3.3-VAR::VERSION.jar -P $SPARK_HOME/jars
```

- If accessing GitHub is problematic, it can also be downloaded from: https://dmetasoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/releases/lakesoul/lakesoul-spark-3.3-VAR::VERSION.jar
+ If accessing GitHub is problematic, it can also be downloaded from: https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul/lakesoul-spark/3.3-VAR::VERSION/lakesoul-spark-3.3-VAR::VERSION.jar
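As a sketch, the mirror download above can be scripted as follows. The version `2.6.1` is a hypothetical stand-in for the actual release (`VAR::VERSION` in these docs); the `wget` step is left commented out so you can substitute your own version first.

```shell
# Build the Huawei Cloud mirror URL for the lakesoul-spark jar.
LAKESOUL_VERSION=2.6.1   # hypothetical example; use the actual release version
MIRROR_BASE="https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul"
SPARK_JAR_URL="${MIRROR_BASE}/lakesoul-spark/3.3-${LAKESOUL_VERSION}/lakesoul-spark-3.3-${LAKESOUL_VERSION}.jar"
echo "$SPARK_JAR_URL"
# wget "$SPARK_JAR_URL" -P "$SPARK_HOME/jars"   # same destination as the GitHub download above
```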

:::tip
Since version 2.1.0, LakeSoul's own dependencies have been shaded into a single jar. Earlier versions were released as multiple jars in a tar.gz archive.
@@ -95,6 +95,8 @@ spark.sql.defaultCatalog | lakesoul
### 1.4 Setting up a local Flink environment
Taking the latest release as an example, the LakeSoul Flink jar download address is: https://github.com/lakesoul-io/LakeSoul/releases/download/vVAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar

If accessing GitHub is problematic, it can also be downloaded from: https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul/lakesoul-flink/1.17-VAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar

The latest version supports Flink cluster 1.17; the Flink download address is: https://dlcdn.apache.org/flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz
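A minimal sketch of fetching the Flink distribution named above; the unpack and `FLINK_HOME` steps are illustrative assumptions about a standard layout and are left commented out.

```shell
# Construct the Flink 1.17.2 download URL from its version components.
FLINK_VERSION=1.17.2
FLINK_TGZ="flink-${FLINK_VERSION}-bin-scala_2.12.tgz"
FLINK_TGZ_URL="https://dlcdn.apache.org/flink/flink-${FLINK_VERSION}/${FLINK_TGZ}"
echo "$FLINK_TGZ_URL"
# wget "$FLINK_TGZ_URL"
# tar -xzf "$FLINK_TGZ"
# export FLINK_HOME="$PWD/flink-${FLINK_VERSION}"   # assumed unpack directory name
```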

#### 1.4.1 Start the Flink SQL shell
@@ -50,7 +50,7 @@ export LAKESOUL_PG_PASSWORD=root
## SQL
### Download the LakeSoul Flink Jar
It can be downloaded from the LakeSoul Release page: https://github.com/lakesoul-io/LakeSoul/releases/download/vVAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar.
- If accessing GitHub is problematic, it can also be downloaded via this link: https://dmetasoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/releases/lakesoul/lakesoul-flink-flink-1.17-VAR::VERSION.jar
+ If accessing GitHub is problematic, it can also be downloaded via this link: https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul/lakesoul-flink/1.17-VAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar

### Using the SQL Client
```bash
@@ -393,4 +393,4 @@ SET execution.runtime-mode = batch;
| readstarttime | Start read timestamp; if no start timestamp is specified, reading starts from the first version | 'readstarttime'='2023-05-01 15:15:15' |
| readendtime | End read timestamp; if no end timestamp is specified, reading goes up to the current latest version | 'readendtime'='2023-05-01 15:20:15' |
| timezone | Time zone of the timestamps; if not specified, the local machine's time zone is used | 'timezone'='Asia/Shanghai' |
:::
@@ -105,7 +105,7 @@ $FLINK_HOME/bin/start-cluster.sh
```bash
--server_time_zone UTC
```

- The lakesoul-flink jar can be downloaded from the [Github Release](https://github.com/lakesoul-io/LakeSoul/releases/) page. If accessing GitHub is problematic, it can also be downloaded via this link: https://dmetasoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/releases/lakesoul/lakesoul-flink-1.17-VAR::VERSION.jar
+ The lakesoul-flink jar can be downloaded from the [Github Release](https://github.com/lakesoul-io/LakeSoul/releases/) page. If accessing GitHub is problematic, it can also be downloaded via this link: https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul/lakesoul-flink/1.17-VAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar

On the Flink job page at http://localhost:8081, click Running Jobs to check whether the LakeSoul job is in the `Running` state.

@@ -203,4 +203,4 @@ INSERT INTO mysql_test_2 VALUES ('Bob', '10010');
```

LakeSoul also successfully syncs and reads the data of the new table:
![](spark-read-after-new-table.png)
@@ -30,7 +30,7 @@ spark-submit --jars "lakesoul-spark-3.3-VAR::VERSION.jar"

The jar can be downloaded from the GitHub Release page: https://github.com/lakesoul-io/LakeSoul/releases/download/vVAR::VERSION/lakesoul-spark-3.3-VAR::VERSION.jar

- Or download from the China-local address: https://dmetasoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/releases/lakesoul/lakesoul-spark-3.3-VAR::VERSION.jar
+ Or download from the China-local address: https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul/lakesoul-spark/3.3-VAR::VERSION/lakesoul-spark-3.3-VAR::VERSION.jar
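For illustration, the download-then-submit flow might be sketched as follows. `2.6.1` is a hypothetical example version, and the remaining `spark-submit` arguments depend on your application, so the network and submit steps are left commented out.

```shell
# Derive the jar name and its mirror URL from the version, then submit with it.
LAKESOUL_VERSION=2.6.1   # hypothetical example; use the actual release version
SPARK_JAR="lakesoul-spark-3.3-${LAKESOUL_VERSION}.jar"
MIRROR_URL="https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul/lakesoul-spark/3.3-${LAKESOUL_VERSION}/${SPARK_JAR}"
echo "$SPARK_JAR"
# wget "$MIRROR_URL"
# spark-submit --jars "$SPARK_JAR" <your-application>   # as in the spark-submit line above
```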

### Set up a Java/Scala project
Add the following Maven dependency:
@@ -141,6 +141,8 @@ taskmanager.memory.task.off-heap.size: 3000m
### Add the LakeSoul jar to the Flink deployment directory
Download the LakeSoul Flink jar from: https://github.com/lakesoul-io/LakeSoul/releases/download/vVAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar

Or download from the China-local address: https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul/lakesoul-flink/1.17-VAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar

Then place the jar file under `$FLINK_HOME/lib`. After that, you can start a Flink session cluster or application as usual.
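A sketch of this deployment step; `2.6.1` again stands in for the actual version, and the copy/start commands (commented out) assume a standard Flink layout with `$FLINK_HOME` set.

```shell
# Name the LakeSoul Flink jar for the target version, then deploy it to Flink's lib/.
LAKESOUL_VERSION=2.6.1   # hypothetical example; use the actual release version
FLINK_JAR="lakesoul-flink-1.17-${LAKESOUL_VERSION}.jar"
echo "$FLINK_JAR"
# cp "$FLINK_JAR" "$FLINK_HOME/lib/"
# "$FLINK_HOME/bin/start-cluster.sh"   # start a session cluster as described above
```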

:::tip
@@ -162,4 +164,4 @@ export HADOOP_CLASSPATH=`$HADOOP_HOME/bin/hadoop classpath`
<artifactId>lakesoul-flink</artifactId>
<version>1.17-VAR::VERSION</version>
</dependency>
```
@@ -21,7 +21,7 @@ Since version 2.1.0, LakeSoul has implemented a Flink CDC Sink that supports the Table API
### 1. Download the LakeSoul Flink Jar
It can be downloaded from the LakeSoul Release page: https://github.com/lakesoul-io/LakeSoul/releases/download/vVAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar.

- If accessing GitHub is problematic, it can also be downloaded via this link: https://dmetasoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/releases/lakesoul/lakesoul-flink-flink-1.17-VAR::VERSION.jar
+ If accessing GitHub is problematic, it can also be downloaded via this link: https://mirrors.huaweicloud.com/repository/maven/com/dmetasoul/lakesoul-flink/1.17-VAR::VERSION/lakesoul-flink-1.17-VAR::VERSION.jar

The currently supported Flink version is 1.17.

@@ -295,4 +295,4 @@ Type mapping between MongoDB and LakeSoul
5. Postgres must be configured with wal_level = logical
6. For Postgres to capture complete Update event information, run: alter table tablename replica identity full
7. Oracle requires supplemental logging to be enabled for synced tables: ALTER TABLE inventory.customers ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
8. When passing tables for Oracle, avoid using the schema.* form to pass in multiple tables.
