[WebConsole] Sync to github manually #1073

Open · wants to merge 15 commits into base: master
Changes from 9 commits
1 change: 1 addition & 0 deletions data_processing/.gitignore
@@ -0,0 +1 @@
target/
14 changes: 14 additions & 0 deletions data_processing/README.md
@@ -0,0 +1,14 @@
## Package
```shell
mvn clean scala:compile assembly:single
```

## Run
```shell
mvn scala:run -DmainClass=com.bytedance.aml.enterprise.Main
```

## Dependencies
* Spark 3.0.1
* Java 8
* Scala 2.12
115 changes: 115 additions & 0 deletions data_processing/pom.xml
@@ -0,0 +1,115 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.bytedance.aml.enterprise</groupId>
    <artifactId>sm4spark</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spark.version>3.0.3</spark.version>

        <java.version>1.8</java.version>
        <scala.major.version>2.12</scala.major.version>
        <scala.version>2.12.10</scala.version>

        <maven.compiler.target>1.8</maven.compiler.target>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.release>8</maven.compiler.release>
    </properties>

    <dependencies>
        <!-- Scala -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>

        <dependency>
            <groupId>commons-codec</groupId>
            <artifactId>commons-codec</artifactId>
            <version>1.15</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
            <exclusions>
                <exclusion>
                    <groupId>com.google.guava</groupId>
                    <artifactId>guava</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
    </dependencies>

    <build>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <configuration>
                        <source>${java.version}</source>
                        <target>${java.version}</target>
                    </configuration>
                </plugin>

                <plugin>
                    <groupId>net.alchim31.maven</groupId>
                    <artifactId>scala-maven-plugin</artifactId>
                    <version>4.3.0</version>
                    <executions>
                        <execution>
                            <id>scala-compile-first</id>
                            <phase>process-resources</phase>
                            <goals>
                                <goal>add-source</goal>
                                <goal>compile</goal>
                            </goals>
                        </execution>
                        <execution>
                            <id>scala-test-compile</id>
                            <phase>process-test-resources</phase>
                            <goals>
                                <goal>testCompile</goal>
                            </goals>
                        </execution>
                    </executions>
                    <configuration>
                        <scalaVersion>${scala.version}</scalaVersion>
                    </configuration>
                </plugin>

                <!-- scala assembly-->
                <plugin>
                    <artifactId>maven-assembly-plugin</artifactId>
                    <configuration>
                        <finalName>${project.artifactId}-${project.version}-RELEASE</finalName>
                        <archive>
                            <manifest>
                                <mainClass>com.bytedance.aml.enterprise.Main</mainClass>
                            </manifest>
                        </archive>
                        <descriptorRefs>
                            <descriptorRef>jar-with-dependencies</descriptorRef>
                        </descriptorRefs>
                    </configuration>
                    <executions>
                        <execution>
                            <id>make-assembly</id>
                            <phase>package</phase>
                            <goals>
                                <goal>single</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>
</project>
36 changes: 36 additions & 0 deletions data_processing/spark/csv_to_hive.py
@@ -0,0 +1,36 @@
# Copyright 2023 The FedLearner Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, lower


def run():
    spark = SparkSession \
        .builder \
        .enableHiveSupport() \
        .config('hive.exec.dynamic.partition', 'true') \
        .config('hive.exec.dynamic.partition.mode', 'nonstrict') \
        .getOrCreate()

    df = spark.read.option('header', 'false') \
        .csv('/home/byte_aml_tob/fedlearner_v2/njb/reduced.csv')
    df = df.select(lower(col(df.columns[0])).alias('phone_sha256'))
    df.write.mode('overwrite').insertInto('aml_tob.njb_intersection_sha256')
    spark.stop()


if __name__ == '__main__':
    run()
36 changes: 36 additions & 0 deletions data_processing/spark/hive_to_csv.py
@@ -0,0 +1,36 @@
# Copyright 2023 The FedLearner Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

from pyspark.sql import SparkSession


def run():
    spark = SparkSession \
        .builder \
        .enableHiveSupport() \
        .getOrCreate()

    df = spark.sql('SELECT DISTINCT uid FROM aml_tob.njb_intersection_uid WHERE uid IS NOT NULL')

    # Partition automatically
    df.write.format('csv') \
        .option('compression', 'none') \
        .option('header', 'false') \
        .save('/home/byte_aml_tob/fedlearner_v2/njb/reduced_uid', mode='overwrite')
    spark.stop()


if __name__ == '__main__':
    run()
@@ -0,0 +1,26 @@
/* Copyright 2023 The FedLearner Authors. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.bytedance.aml.enterprise

import org.apache.spark.sql.SparkSession


object Main {
  def main(args: Array[String]): Unit = {
    // trimmed
  }
}
@@ -0,0 +1,25 @@
/* Copyright 2023 The FedLearner Authors. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.bytedance.aml.enterprise.sparkudaf

import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions

object Hist {
  def getFunc: UserDefinedFunction = functions.udaf(HistUDAF, ExpressionEncoder[HistIn])
}
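
A hedged usage sketch for the UDAF exposed by `Hist.getFunc` (not part of this change): it assumes the `UserDefinedFunction` returned by `functions.udaf` binds its arguments to `HistIn`'s fields in declaration order (`value`, `min`, `max`, `binsNum`, `interval`); the table and column names below are made up.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, lit}
import com.bytedance.aml.enterprise.sparkudaf.Hist

// Hypothetical driver, only to illustrate how the histogram UDAF could be applied.
object HistUsageSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
    val hist = Hist.getFunc

    // 10 equal-width bins over an assumed value range [0.0, 100.0), interval = 10.0.
    val df = spark.table("aml_tob.some_feature_table") // assumed table name
    val result = df.agg(
      hist(col("feature"), lit(0.0), lit(100.0), lit(10), lit(10.0)).alias("feature_hist"))
    result.show(truncate = false)

    spark.stop()
  }
}
```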
@@ -0,0 +1,65 @@
/* Copyright 2023 The FedLearner Authors. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.bytedance.aml.enterprise.sparkudaf

import org.apache.spark.sql.expressions.Aggregator
import org.apache.spark.sql.{Encoder, Encoders}

case class HistIn(var value: Double, var min: Double, var max: Double, var binsNum: Int, var interval: Double)
case class Bucket(var bins: Array[Double], var counts: Array[Int])

object HistUDAF extends Aggregator[HistIn, Bucket, Bucket] {

  // Empty buffer; bins and counts are allocated lazily on the first reduce call.
  def zero: Bucket = Bucket(bins = new Array[Double](0), counts = new Array[Int](0))

  def reduce(buffer: Bucket, data: HistIn): Bucket = {
    if (buffer.bins.length == 0) {
      buffer.bins = new Array[Double](data.binsNum + 1)
      for (i <- 0 until data.binsNum) {
        buffer.bins(i) = i * data.interval + data.min
      }
      buffer.bins(data.binsNum) = data.max
      buffer.counts = new Array[Int](data.binsNum)
    }
    if (data.interval != 0.0) {
      // Clamp out-of-range values into the first or last bucket.
      var bucket_idx = ((data.value - data.min) / data.interval).toInt
      if (bucket_idx < 0) {
        bucket_idx = 0
      } else if (bucket_idx > (data.binsNum - 1)) {
        bucket_idx = data.binsNum - 1
      }
      buffer.counts(bucket_idx) += 1
    }
    buffer
  }

  def merge(b1: Bucket, b2: Bucket): Bucket = {
    (b1.bins.length, b2.bins.length) match {
      case (_, 0) => b1
      case (0, _) => b2
      case _ =>
        b1.counts = (b1.counts zip b2.counts).map(x => x._1 + x._2)
        b1
    }
  }

  def finish(reduction: Bucket): Bucket = reduction

  def bufferEncoder: Encoder[Bucket] = Encoders.product

  def outputEncoder: Encoder[Bucket] = Encoders.product

}
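
A minimal sanity-check sketch for the aggregator above (not part of this change): it drives `zero` and `reduce` by hand outside Spark to show how values are bucketed and how out-of-range values are clamped into the first or last bin. The sample numbers are made up.

```scala
import com.bytedance.aml.enterprise.sparkudaf.{Bucket, HistIn, HistUDAF}

// Hypothetical scratch object; exercises the Aggregator callbacks directly.
object HistUDAFSketch {
  def main(args: Array[String]): Unit = {
    // 5 bins of width 2.0 over [0.0, 10.0); 11.0 falls past the range and lands in the last bin.
    val samples = Seq(0.5, 1.9, 4.2, 9.9, 11.0)
    val bucket: Bucket = samples.foldLeft(HistUDAF.zero) { (acc, v) =>
      HistUDAF.reduce(acc, HistIn(value = v, min = 0.0, max = 10.0, binsNum = 5, interval = 2.0))
    }
    println(bucket.bins.mkString(", "))   // 0.0, 2.0, 4.0, 6.0, 8.0, 10.0
    println(bucket.counts.mkString(", ")) // 2, 0, 1, 0, 2
  }
}
```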
21 changes: 21 additions & 0 deletions docs/licenses/LICENCE-BurntSushi_toml.txt
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2013 TOML authors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
21 changes: 21 additions & 0 deletions docs/licenses/LICENCE-Go-Logrus.txt
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2014 Simon Eskildsen

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
21 changes: 21 additions & 0 deletions docs/licenses/LICENCE-Go-Testify.txt
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2012-2020 Mat Ryer, Tyler Bunnell and contributors.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.