Development Challenge #143

Open · wants to merge 3 commits into master
95 changes: 95 additions & 0 deletions README.md
@@ -0,0 +1,95 @@
CSVConverter 0.1.0
==============

Description
--------------

Web-based application that accepts, via an upload form, a comma-separated (CSV) file
with the following columns: date, category, employee name, employee address, expense description, pre-tax amount, tax name, and tax amount.


After upload, the application processes the file, stores the data in an in-memory relational database,
and displays a table of the total expense amount per month represented by the uploaded file.
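
For illustration, a hypothetical input file matching the expected column order might look like the sample below. The header text, dates, names, and amounts are made up and only illustrative; the exact date and amount formats are whatever the real data uses.

```
date,category,employee name,employee address,expense description,pre-tax amount,tax name,tax amount
2016-12-01,Travel,John Smith,"12 Main St, Toronto",Taxi to airport,40.00,HST,5.20
2016-12-15,Meals,Jane Doe,"34 King St, Toronto",Team lunch,85.00,HST,11.05
```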


The following assumptions are made:


i. Columns will always be in that order.

ii. There will always be data in each column.

iii. There will always be a header line.


*Notes:*

*The application allows parallel processing of the files.*

*The data stored in the in-memory database is discarded after the application terminates.*

*The records in the uploaded files are not uniquely identified and are not de-duplicated in the database. As a result, repeated uploads of the same or overlapping data will inflate the monthly expense totals returned by the application.*

*Previously uploaded copies of the files are cleaned up when the application starts.*


Build and run instructions
---------------------------

To **build** the application you will need:

JDK 1.8 or later.

Maven 3.0+.


To build the application execute:

**mvn clean install**

To create a stand-alone application package run:

**mvn package**

(the built JAR file will be located in the *target* directory.)


Both commands must be executed from the root application directory where the *pom.xml* file resides.


To **run** the application you must have:

Java 1.8 or later.

The application requires write access to the directory it is launched from (uploaded files are stored there temporarily).


To run with Maven execute:

**mvn spring-boot:run**

from the root application directory.


Alternatively, to run the pre-packaged JAR file, execute:

**java -jar <path>/csvconverter-0.1.0.jar**


No preliminary steps are necessary to run the application.

Once launched, it can be accessed with your web browser (tested with Firefox and Safari) at the following URI:

**http://localhost:8080**


The application can be terminated from the command line with CTRL+C (which sends a SIGINT signal).


Implementation notes
----------------------

I completed the application within a couple of nights and enjoyed every step of its development. I was particularly happy to get hands-on with Spring Batch, something I had been planning to do for a while but had not had a good reason to. I believe that with Spring Batch and Spring Boot we can quickly build well-structured applications that separate concerns and leverage good OO practices.

My TODOs for the next version would include improved error handling, better unit test coverage, and separation of data between different web session contexts.
61 changes: 61 additions & 0 deletions pom.xml
@@ -0,0 +1,61 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.wave</groupId>
    <artifactId>csvconverter</artifactId>
    <version>0.1.0</version>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>1.4.3.RELEASE</version>
    </parent>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-thymeleaf</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-configuration-processor</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-batch</artifactId>
        </dependency>
        <dependency>
            <groupId>org.hsqldb</groupId>
            <artifactId>hsqldb</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <properties>
        <java.version>1.8</java.version>
    </properties>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>
33 changes: 33 additions & 0 deletions src/main/java/com/wave/csvconverter/Application.java
@@ -0,0 +1,33 @@
package com.wave.csvconverter;

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.scheduling.annotation.EnableAsync;

import com.wave.csvconverter.configuration.upload.UploadStorageProperties;
import com.wave.csvconverter.service.upload.UploadService;

@SpringBootApplication
@EnableConfigurationProperties(UploadStorageProperties.class)
@EnableAsync
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    /*
     * The bean initialization cleans up all leftover artifacts
     * from the previous runs.
     */
    @Bean
    CommandLineRunner init(UploadService uploadService) {
        return (args) -> {
            uploadService.deleteAll();
            uploadService.init();
        };
    }
}
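
UploadService itself is not part of this diff, so the `deleteAll()` and `init()` calls above are only visible from the caller's side. Below is a minimal sketch of what such a service might look like, assuming it manages the directory configured via UploadStorageProperties; the method bodies and the extra `store()` helper are illustrative guesses, not the project's actual implementation.

```java
package com.wave.csvconverter.service.upload;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.util.FileSystemUtils;
import org.springframework.web.multipart.MultipartFile;

import com.wave.csvconverter.configuration.upload.UploadStorageProperties;

@Service
public class UploadService {

    private final Path rootLocation;

    @Autowired
    public UploadService(UploadStorageProperties properties) {
        this.rootLocation = Paths.get(properties.getLocation());
    }

    /* Create the upload directory if it does not exist yet. */
    public void init() {
        try {
            Files.createDirectories(rootLocation);
        } catch (IOException e) {
            throw new RuntimeException("Could not initialize upload storage", e);
        }
    }

    /* Remove the upload directory and any leftover files from previous runs. */
    public void deleteAll() {
        FileSystemUtils.deleteRecursively(rootLocation.toFile());
    }

    /* Store an uploaded file and return the path it was written to. */
    public Path store(MultipartFile file) {
        try {
            Path target = rootLocation.resolve(file.getOriginalFilename());
            Files.copy(file.getInputStream(), target);
            return target;
        } catch (IOException e) {
            throw new RuntimeException("Failed to store file " + file.getOriginalFilename(), e);
        }
    }
}
```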
70 changes: 70 additions & 0 deletions src/main/java/com/wave/csvconverter/configuration/persistence/CSVFileReader.java
@@ -0,0 +1,70 @@
package com.wave.csvconverter.configuration.persistence;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.io.FileSystemResource;
import org.springframework.stereotype.Component;

import com.wave.csvconverter.domain.EmployeeExpense;

/*
 * Custom batch file reader. The reader accepts the CSV file name
 * for reading from the JobParameters.
 * Each job/reader instance can read its own file to ensure parallel
 * processing of multiple conversion batches.
 */
@Component
@StepScope
public class CSVFileReader extends FlatFileItemReader<EmployeeExpense> {

    private String fileName;

    private static final Logger log = LoggerFactory.getLogger(CSVFileReader.class);

    /*
     * The CSV file name is passed to the constructor.
     */
    @Autowired
    public CSVFileReader(@Value("#{jobParameters['convert.file.name']}") final String fileName) {
        setFileName(fileName);
        initializeReader();
    }

    /*
     * The reader is initialized with a custom line mapper used to fetch records from
     * the CSV file according to the required format.
     */
    private void initializeReader() {
        log.info("Initialize CSV Reader for the file '" + fileName + "'");

        setResource(new FileSystemResource(fileName));
        setLinesToSkip(1); // first line is the header definition
        setLineMapper(new DefaultLineMapper<EmployeeExpense>() {
            {
                setLineTokenizer(new DelimitedLineTokenizer() {
                    {
                        setNames(new String[] { "date", "category", "employee_name", "employee_address",
                                "expense_description", "pretax_amount", "tax_name", "tax_amount" });
                    }
                });
                setFieldSetMapper(new BeanWrapperFieldSetMapper<EmployeeExpense>() {
                    {
                        setTargetType(EmployeeExpense.class);
                    }
                });
            }
        });
    }

    public void setFileName(String fileName) {
        this.fileName = fileName;
    }

}
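
The EmployeeExpense domain class is referenced here and by the writer's SQL, but it is not included in this diff. A plausible sketch is shown below, assuming the bean property names mirror the tokenizer column names and the writer's named SQL parameters (hence the unusual snake_case); the field types, in particular keeping dates and amounts as String, are guesses rather than the project's actual code.

```java
package com.wave.csvconverter.domain;

/*
 * Hypothetical reconstruction of the domain bean used by the reader and writer.
 * The real class may use java.util.Date / BigDecimal for the date and amounts.
 */
public class EmployeeExpense {

    private String date;
    private String category;
    private String employee_name;
    private String employee_address;
    private String expense_description;
    private String pretax_amount;
    private String tax_name;
    private String tax_amount;

    public String getDate() { return date; }
    public void setDate(String date) { this.date = date; }

    public String getCategory() { return category; }
    public void setCategory(String category) { this.category = category; }

    public String getEmployee_name() { return employee_name; }
    public void setEmployee_name(String employee_name) { this.employee_name = employee_name; }

    public String getEmployee_address() { return employee_address; }
    public void setEmployee_address(String employee_address) { this.employee_address = employee_address; }

    public String getExpense_description() { return expense_description; }
    public void setExpense_description(String expense_description) { this.expense_description = expense_description; }

    public String getPretax_amount() { return pretax_amount; }
    public void setPretax_amount(String pretax_amount) { this.pretax_amount = pretax_amount; }

    public String getTax_name() { return tax_name; }
    public void setTax_name(String tax_name) { this.tax_name = tax_name; }

    public String getTax_amount() { return tax_amount; }
    public void setTax_amount(String tax_amount) { this.tax_amount = tax_amount; }
}
```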
91 changes: 91 additions & 0 deletions src/main/java/com/wave/csvconverter/configuration/persistence/ConversionBatchConfiguration.java
@@ -0,0 +1,91 @@
package com.wave.csvconverter.configuration.persistence;

import javax.sql.DataSource;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

import com.wave.csvconverter.domain.EmployeeExpense;
import com.wave.csvconverter.utils.persistence.JobCompletionNotificationListener;

/*
 * The class defines the batch work-flow for converting CSV files into the RDBMS.
 */
@Configuration
@EnableBatchProcessing
public class ConversionBatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DataSource dataSource;

    @Autowired
    public JobRepository jobRepository;

    @Autowired
    public CSVFileReader fileReader;

    /*
     * Override the basic synchronous Job launcher to enable
     * asynchronous processing of the Job, which prevents
     * web request timeouts and improves UX.
     */
    @Bean
    public JobLauncher asyncJobLauncher() {
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(jobRepository);
        simpleJobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        return simpleJobLauncher;
    }

    /*
     * Simple writer definition. The data is written into the default in-memory
     * relational database. The table is initialized each time the application starts.
     */
    @Bean
    public JdbcBatchItemWriter<EmployeeExpense> writer() {
        JdbcBatchItemWriter<EmployeeExpense> writer = new JdbcBatchItemWriter<EmployeeExpense>();
        writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<EmployeeExpense>());
        writer.setSql(
                "INSERT INTO employee_expenses (expense_date, category, employee_name, employee_address, expense_description, pretax_amount, tax_name, tax_amount) "
                        + "VALUES (:date, :category, :employee_name, :employee_address, :expense_description, :pretax_amount, :tax_name, :tax_amount)");
        writer.setDataSource(dataSource);
        return writer;
    }

    /*
     * Register the custom Job completion notification listener which tells us when
     * a conversion Job is done.
     */
    @Bean
    public Job job(JobCompletionNotificationListener listener) {
        return jobBuilderFactory.get("importEmployeeExpenseJob").incrementer(new RunIdIncrementer()).listener(listener)
                .flow(step1()).end().build();
    }

    /*
     * Conversion step definition.
     */
    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<EmployeeExpense, EmployeeExpense>chunk(1).reader(fileReader)
                .writer(writer()).build();
    }
}
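
The web controller that ties the upload form to this batch configuration is not shown in this diff. As a hedged sketch of how a stored upload might be handed to the asynchronous launcher (the controller name, request mapping, and the `UploadService.store()` helper are assumptions, not code from this PR), the job would be started with the file path passed as the `convert.file.name` job parameter that CSVFileReader reads:

```java
package com.wave.csvconverter.controller;

import java.nio.file.Path;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.multipart.MultipartFile;

import com.wave.csvconverter.service.upload.UploadService;

@Controller
public class UploadController {

    @Autowired
    private UploadService uploadService;

    @Autowired
    private JobLauncher asyncJobLauncher;

    @Autowired
    private Job importEmployeeExpenseJob;

    @PostMapping("/")
    public String handleUpload(@RequestParam("file") MultipartFile file) throws Exception {
        // Persist the uploaded file to the temporary storage location.
        Path stored = uploadService.store(file);

        // Pass the stored file's path to the job; CSVFileReader picks it up via
        // #{jobParameters['convert.file.name']}. The timestamp keeps the parameter
        // set unique so repeated uploads launch new job instances.
        JobParameters parameters = new JobParametersBuilder()
                .addString("convert.file.name", stored.toString())
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();

        // The async launcher returns immediately; the conversion runs in the background.
        asyncJobLauncher.run(importEmployeeExpenseJob, parameters);

        return "redirect:/";
    }
}
```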
21 changes: 21 additions & 0 deletions src/main/java/com/wave/csvconverter/configuration/upload/UploadStorageProperties.java
@@ -0,0 +1,21 @@
package com.wave.csvconverter.configuration.upload;

import org.springframework.boot.context.properties.ConfigurationProperties;

@ConfigurationProperties("storage")
public class UploadStorageProperties {

    /**
     * Root folder location for temporary storage of the uploaded files.
     */
    private String location = "upload-dir";

    public String getLocation() {
        return location;
    }

    public void setLocation(String location) {
        this.location = location;
    }

}
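
Since these properties bind under the `storage` prefix, the default `upload-dir` location can presumably be overridden in `application.properties` or via a command-line argument, for example with a line such as `storage.location=/tmp/csvconverter-uploads` (the path shown here is only illustrative).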