Commit

docs: Fixed links
jruaux committed Dec 21, 2023
1 parent 6ecf871 commit 8f89e90
Showing 7 changed files with 16 additions and 16 deletions.
2 changes: 1 addition & 1 deletion docs/guide/guide.gradle
@@ -13,7 +13,7 @@ asciidoctor {
     jvmArgs("--add-opens", "java.base/sun.nio.ch=ALL-UNNAMED", "--add-opens", "java.base/java.io=ALL-UNNAMED")
   }
   attributes = [
-    'source-highlighter': 'prettify'
+    'source-highlighter': 'prettify'
   ]
 }

Expand Down
1 change: 1 addition & 0 deletions docs/guide/src/docs/asciidoc/_links.adoc
@@ -1,3 +1,4 @@
+:link_releases: link:https://github.com/redis-field-engineering/redis-kafka-connect/releases[releases page]
 :link_redis_enterprise: link:https://redis.com/redis-enterprise-software/overview/[Redis Enterprise]
 :link_lettuce_uri: link:https://github.com/lettuce-io/lettuce-core/wiki/Redis-URI-and-connection-details#uri-syntax[Redis URI Syntax]
 :link_redis_notif: link:https://redis.io/docs/manual/keyspace-notifications[Redis Keyspace Notifications]
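The entries above define AsciiDoc document attributes, so pages that include `_links.adoc` can reference a link by name instead of repeating the URL. A minimal sketch of the mechanism (the attribute definition is real, taken from this diff; the usage sentence is illustrative):

```asciidoc
// Definition, as in _links.adoc:
:link_redis_notif: link:https://redis.io/docs/manual/keyspace-notifications[Redis Keyspace Notifications]

// Illustrative usage in any page that includes _links.adoc;
// the reference renders as a hyperlink titled "Redis Keyspace Notifications":
See {link_redis_notif} for details.
```

Centralizing URLs this way is what makes a "fixed links" commit like this one a handful of one-line edits rather than a search-and-replace across every page.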
4 changes: 2 additions & 2 deletions docs/guide/src/docs/asciidoc/docker.adoc
@@ -12,11 +12,11 @@ https://docs.docker.com/get-docker/[Docker]
 
 == Run the example
 
-Clone the https://github.com/{github-owner}/{github-repo}.git[{github-repo}] repository and execute `run.sh` in `docker` directory:
+Clone the link:{project-scm}[github repository] and execute `run.sh` in `docker` directory:
 
 [source,console,subs="attributes"]
 ----
-git clone https://github.com/{github-owner}/{github-repo}.git
+git clone {project-scm}
 ./run.sh
 ----

3 changes: 2 additions & 1 deletion docs/guide/src/docs/asciidoc/index.adoc
@@ -6,11 +6,12 @@
 
 include::{includedir}/_links.adoc[]
 
-:leveloffset: 1
+:leveloffset: +1
 include::{includedir}/introduction.adoc[]
 include::{includedir}/install.adoc[]
 include::{includedir}/connect.adoc[]
 include::{includedir}/sink.adoc[]
 include::{includedir}/source.adoc[]
+include::{includedir}/docker.adoc[]
 include::{includedir}/resources.adoc[]
 :leveloffset: -1
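Regarding the `:leveloffset: 1` to `:leveloffset: +1` change above: in AsciiDoc a signed value is a relative offset while an unsigned value is absolute, which matters when a document is composed by inclusion. A sketch of the difference (file names illustrative):

```asciidoc
// master.adoc
= Guide

// Relative: push headings in included files down one level,
// on top of whatever offset is already in effect.
:leveloffset: +1
include::introduction.adoc[]
:leveloffset: -1

// introduction.adoc -- its document title below renders as a
// level-1 section ("== Introduction") inside the master document.
= Introduction
```

An absolute `:leveloffset: 1` would instead force the offset to exactly 1, which composes incorrectly if the including document is itself included elsewhere.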
4 changes: 2 additions & 2 deletions docs/guide/src/docs/asciidoc/install.adoc
@@ -5,7 +5,7 @@ Select one of the methods below to install {project-title}.
 
 == Download
 
-Download the latest release archive from https://github.com/{github-owner}/{github-repo}/releases[here].
+Download the latest release archive from the link:{project-url}/releases[releases page].
 
 == Confluent Hub

@@ -14,4 +14,4 @@
 
 == Manually
 
-Follow the instructions in {link_manual_install}
+Follow the instructions in {link_manual_install}.
13 changes: 6 additions & 7 deletions docs/guide/src/docs/asciidoc/sink.adoc
@@ -1,8 +1,7 @@
 [[_sink]]
 = Sink Connector Guide
-:name: Redis Kafka Sink Connector
 
-The {name} consumes records from a Kafka topic and writes the data to Redis.
+The sink connector consumes records from a Kafka topic and writes the data to Redis.
 It includes the following features:
 
 * <<_sink_at_least_once_delivery,At least once delivery>>
@@ -12,17 +11,17 @@
 
 [[_sink_at_least_once_delivery]]
 == At least once delivery
-The {name} guarantees that records from the Kafka topic are delivered at least once.
+The sink connector guarantees that records from the Kafka topic are delivered at least once.
 
 [[_sink_tasks]]
 == Multiple tasks
 
-The {name} supports running one or more tasks.
+The sink connector supports running one or more tasks.
 You can specify the number of tasks with the `tasks.max` configuration property.
 
 [[_sink_data_structures]]
 == Redis Data Structures
-The {name} supports the following Redis data-structure types as targets:
+The sink connector supports the following Redis data-structure types as targets:
 
 [[_collection_key]]
 * Collections: <<_sink_stream,stream>>, <<_sink_list,list>>, <<_sink_set,set>>, <<_sink_zset,sorted set>>, <<_sink_timeseries,time series>>
@@ -168,10 +167,10 @@ The Kafka record value must be a number (e.g. `float64`) as it is used as the sample value.
 [[_sink_data_formats]]
 == Data Formats
 
-The {name} supports different data formats for record keys and values depending on the target Redis data structure.
+The sink connector supports different data formats for record keys and values depending on the target Redis data structure.
 
 === Kafka Record Keys
-The {name} expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:
+The sink connector expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:
 
 [options="header",cols="h,1,1"]
 |====
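The `tasks.max` property mentioned in the sink guide above is set in the connector's configuration. A hypothetical minimal sink configuration for context (`tasks.max` is a standard Kafka Connect property; the connector class and `redis.uri` names are assumptions and may differ from the connector's actual options):

```properties
# Hypothetical sink connector config; class and Redis property names assumed.
name=redis-sink
connector.class=com.redis.kafka.connect.RedisSinkConnector
# Run up to 4 tasks in parallel across the topic's partitions.
tasks.max=4
topics=orders
redis.uri=redis://localhost:6379
```

Kafka Connect treats `tasks.max` as an upper bound; the actual task count also depends on how many partitions are available to distribute.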
5 changes: 2 additions & 3 deletions docs/guide/src/docs/asciidoc/source.adoc
@@ -1,8 +1,7 @@
 [[_source]]
 = Source Connector Guide
-:name: Redis Kafka Source Connector
 
-The {name} includes 2 source connectors:
+{project-title} includes 2 source connectors:
 
 * <<_stream_source,Stream>>
 * <<_keys_source,Keys>>
@@ -21,7 +20,7 @@ It includes the following features:
 
 === Delivery Guarantees
 
-The {name} can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
+The stream source connector can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
 The default is at-least-once delivery.
 
 [[_stream_source_at_least_once_delivery]]
