Who is using SeaTunnel? #686

Open
CalvinKirs opened this issue Nov 17, 2021 · 55 comments

@CalvinKirs (Member) commented Nov 17, 2021

Who is using SeaTunnel?
We sincerely thank everyone who keeps using and supporting SeaTunnel. We will do our best to make SeaTunnel better and to make the community and ecosystem more prosperous.

The original intention of this issue
We’d like to listen to the community to make SeaTunnel better.
We want to attract more partners to contribute to SeaTunnel.
We want to learn more about the practical use scenarios of SeaTunnel to help plan our next steps.
What we expect from you
Please submit a comment on this issue including the following information:

your company, school, or organization.
your city and country.
your contact info: blog, email, Twitter (at least one).
the business scenario in which you use SeaTunnel.

You can refer to the following sample:

  • SeaTunnel Version:
  • Organization:
  • Location:
  • Contact:
  • Purpose:
  • WeChat/Slack:
  • Logo Url:
Thanks again for your participation! Your support is what motivates us to keep moving forward.

SeaTunnel community
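
For reference, a filled-in reply might look like the following; every value here is a placeholder, not a real organization:

```
• SeaTunnel Version: 2.3.x
• Organization: Example Corp
• Location: Example City, Example Country
• Contact: [email protected]
• Purpose: Sync data from MySQL to ClickHouse for reporting
• WeChat/Slack: example-handle
• Logo Url: https://example.com/logo.png
```
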
CalvinKirs pinned this issue Nov 17, 2021
@cpaqyx commented Nov 17, 2021

Organization: Anchnet
Location: Shanghai, China
Contact: [email protected]
Purpose: Build a company-level big data processing platform to serve the data lake and data warehouse projects

@windAworld commented Nov 22, 2021

Organization: 益心游戏
Location: Guangzhou, China
Contact: [email protected]
Purpose: Event tracking (buried point) data is ETL'd and written to ClickHouse for real-time analysis

@hj5 commented Nov 22, 2021

Organization: GridSum
Location: Beijing, China
Contact: [email protected]
Purpose: Build the underlying execution engine for the data lake

@xleoken (Member) commented Nov 26, 2021

Organization: CMCC
Location: Suzhou, China
Contact: [email protected]
Purpose: A new kind of data collection product; we will use it in https://ecloud.10086.cn

@chenhu (Contributor) commented Dec 16, 2021

Organization: CETC
Location: Nanjing, China
Contact: [email protected]
Purpose: Build the data pipeline in our data product

@xhmz commented Dec 24, 2021

Organization: Tencent Cloud
Location: Wuhan, China
Contact: [email protected]
Purpose: Analyze and learn design concepts

@WJustHM commented Dec 24, 2021

Organization: HuanYa
Location: Chengdu, China
Contact: [email protected]
Purpose: Build a big data processing and analysis platform for our data center

@superguhua

Organization: 数字广东
Location: Guangzhou, China
Contact: [email protected]
Purpose: Data pipeline, ETL, and data lake

@lphcreat commented Dec 24, 2021

Organization: 碧桂园金融事业群
Location: Xi'an, China
Contact: [email protected]
Purpose: Use this framework to build our big data warehouse

@zhangheng3 commented Dec 24, 2021

Organization: izuiyou
Location: Beijing, China
Contact: [email protected]
Purpose: Build the data pipeline and write to ClickHouse

@KeymanHuang

Organization: Huya
Location: Guangzhou, China
Contact: [email protected]
Purpose: Data pipeline for ingestion and consumption

@lfyee commented Dec 25, 2021

Organization: Bilibili
Location: Shanghai, China
Contact: [email protected]
Purpose: Build the data pipeline tools in our data platform

@cpaqyx commented Dec 29, 2021

Organization: Anchnet
Location: Wuhan, China
Contact: [email protected]
Purpose: Build data lake, data conversion, and ETL tools.

@xtr1993 (Contributor) commented Dec 29, 2021

Organization: yoozoo
Location: Shanghai, China
Contact: [email protected]
Purpose: Build DTS platform

@Xiaojun-Pan commented Dec 29, 2021

Organization: 无锡宝通科技
Location: Wuxi, China
Contact: [email protected]
Purpose: Build our big data warehouse

@TyrantLucifer (Member)

Organization: DeepZero
Location: Beijing, China
Contact: [email protected]
Purpose: Build the data pipeline engine in our data platform

@suxinshuo

Organization: Qunar
Location: Beijing, China
Contact: [email protected]
Purpose: Export data from Hive to ClickHouse.

@ZHAO-Yi-0511

Organization: SongGuo
Location: Beijing, China
Contact: [email protected]
Purpose: Data synchronization from MySQL to Hive.

@shangeyao (Contributor)

Organization: ShiHang Fresh
Location: Suzhou, Jiangsu, China
Contact: [email protected]
Purpose: As an ETL tool for transforming heterogeneous data sources into Spark

@Aaronoooooo

Organization: AURORA
Location: Shenzhen, Guangdong, China
Contact: [email protected]
Purpose: Data synchronization from Hive to ClickHouse.

@wushanru commented Jan 7, 2022

Organization: ZHUOFANINFORMATION
Location: Shanghai, China
Contact: [email protected]
Purpose: Mostly data integration and ETL

@JiaShy94 commented Jan 7, 2022

Organization: Big Data Academy, Zhongke
Location: Zhengzhou, China
Contact: [email protected]
Purpose: Analyze and learn design concepts

@xiaobaiwalk

Organization: EtonKids
Location: Beijing, China
Contact: [email protected]
Purpose: Build a company-level big data processing platform for data mining of user classes.

@libailin

Organization: Qihoo 360
Location: Beijing, China
Contact: [email protected]
Purpose: ETL, data integration service

@lovingfeel commented Jan 17, 2022

Organization: 重庆电网
Location: Chongqing, China
Contact: [email protected]
Purpose: ETL, data integration service

@asdf2014 (Member)

Organization: Shopee
Location: Singapore
Contact: [email protected]
Purpose: Data transformation and synchronization

@bigdataf (Contributor)

Organization: oppo
Location: Beijing, China
Contact: [email protected]
Purpose: ETL and data integration service

@zuoliansheng commented Aug 19, 2022

Organization: www.songguo7.com
Location: Beijing, China
Contact: [email protected]
Purpose: Multi-scenario data synchronization, replacing Sqoop

EricJoy2048 unpinned this issue Sep 21, 2022
@cnmac (Contributor) commented Sep 26, 2022

Organization: DMALL
Location: Beijing, China
Contact: [email protected]; [email protected]
Purpose: Provides a way to develop ETL tasks in a web interface. SeaTunnel is the underlying task execution engine, so users only need to be familiar with SQL to develop ETL tasks.

EricJoy2048 pinned this issue Oct 18, 2022
@dik111 (Contributor) commented Nov 1, 2022

Organization: suofeiya
Location: Guangzhou, China
Contact: [email protected]
Purpose: Provide data fusion services from multiple data sources.

@TKXZH commented Nov 24, 2022

Organization: Coupang
Location: Shanghai, China; Seoul, Korea; Seattle, USA
Contact: [email protected]
Purpose: Provide data sync pipelines among Aurora, Hive, HBase, Redshift, and S3

@HsbcJone (Contributor)

Organization: digitalgd
Location: Guangzhou, China
Contact: [email protected]
Purpose: Hadoop & multi-scenario data synchronization

@ahuljh commented Apr 3, 2023

Organization: Boulderaitech
Location: Hangzhou, China
Contact: [email protected]
Purpose: Sync data from databases to HDFS

@HaoXuAI (Contributor) commented Apr 4, 2023

Organization: JP Morgan & Chase
Location: California, US
Contact: [email protected]
Purpose: Use SeaTunnel as a central data ingestion platform to ingest data into the data lake.

@kpretty (Contributor) commented Apr 27, 2023

Organization: 杭州超级科技
Location: Hangzhou, China
Contact: [email protected]
Purpose: Data integration and ETL

@zengqinchris

Organization: 深圳云镝智慧科技
Location: Shenzhen, China
Contact: [email protected]
Purpose: Data integration and ETL

@ningyanhui

Organization: 重庆觉晓科技有限公司
Location: Chongqing, China
Contact: [email protected]
Purpose: Data integration and ETL

@LucasBeckhan

Organization: 中企通信
Location: Shanghai, China
Contact: [email protected]
Purpose: Data integration and ETL

@reddyafk commented May 8, 2023

Organization: 爱玛科技
Location: Tianjin, China
Contact: [email protected]
Purpose: Data integration and ETL

@linghuliz

Organization: 爱保科技
Location: Beijing, China
Contact: [email protected]
Purpose: Data integration and ETL

@lucklilili (Contributor)

Organization: 卫宁科技
Location: Beijing, China
Contact: [email protected]
Purpose: Assist in data exchange

@riruigit

Organization: 风华高科
Location: Zhaoqing, China
Contact: [email protected]
Purpose: ETL tool; load data into the data warehouse

@nicknin8008 commented May 18, 2023

Organization: 京东科技
Location: Beijing, China
Contact: [email protected]; [email protected]
Purpose: Data exchange tool testing; massive data sync from Hive to ClickHouse

@liuxu2630

Organization: 车未来科技
Location: Tianjin, China
Contact: [email protected]
Purpose: Build the data pipeline tools in our data platform

@wcc1433 commented May 19, 2023

Organization: 常州数方科技有限公司
Location: Changzhou, Jiangsu, China
Contact: [email protected]
Purpose: Data integration

@wenyongning1

Organization: 四川智锂智慧能源科技有限公司
Location: Sichuan, China
Contact: [email protected]
Purpose: Data integration; build the data pipeline tools in our data platform

@sjwh07 commented May 19, 2023

Organization: 杭银消费金融公司
Location: Hangzhou, China
Contact: [email protected]
Purpose: Build the data pipeline and write to Doris

@1553010490

Organization: Boulderaitech
Location: Hangzhou, China
Contact: [email protected]
Purpose: Sync data from MySQL to Hive

@ConnorZJ

Organization: 深圳萨摩耶科技有限公司
Location: Shenzhen, China
Contact: [email protected]
Purpose: Massive data sync between Hive and ClickHouse

@min918 commented Jun 2, 2023

Organization: CMSR
Location: Shanghai, China
Contact: [email protected]
Purpose: Build a big data processing and analysis platform for our data center

@mosence (Contributor) commented Sep 26, 2023

Organization: 海南数造科技有限公司
Location: Haikou, Hainan, China
Contact: [email protected]
Purpose: Build an integrated development platform for data development, data governance, and data services.

EricJoy2048 unpinned this issue Oct 26, 2023
@lightzhao (Contributor)

Organization: 58集团
Location: Beijing, China
Contact: [email protected]
Purpose: Upgrade real-time data integration platform.

@YuriyGavrilov commented Nov 29, 2023

Organization: S7 Airlines
Location: Moscow, Russia
Contact: [email protected]; https://gavrilov.info; http://twitter.com/YuriyGavrilov
Purpose: As part of our enterprise data platform; we are currently testing and adopting SeaTunnel and want to use it in a data mesh with Kafka, Trino, and other databases as data sinks, plus some ETL.

hailin0 pinned this issue Nov 24, 2024