| title | description | slug | authors |
|---|---|---|---|
| Tipsrundan 47 | git, data and IT security sprinkled with some architecture - sounds good? Then dive in! 🤿 P.S. this is the final Tipsrundan before Christmas, see you next year! | 47 | |
👋 Welcome to Tipsrundan! A biweekly newsletter by AFRY IT South with ❤️
git, data and IT security sprinkled with some architecture - sounds good? Then dive in! 🤿 P.S. this is the final Tipsrundan before Christmas, see you next year!
Do you need to restore files from staging in git?
We have the working directory, the staging area (index) and committed files. If you've used `git add`, have something currently in staging and then make more changes, this can be a bit "blocking": you want to revert to what you had in staging but you're not sure how...
That's when `git restore` comes to the rescue!
The description from git-scm.com:

> Restore specified paths in the working tree with some contents from a restore source. If a path is tracked but does not exist in the restore source, it will be removed to match the source.
>
> The command can also be used to restore the content in the index with `--staged`, or restore both the working tree and the index with `--staged --worktree`.
>
> By default, if `--staged` is given, the contents are restored from `HEAD`, otherwise from the index. Use `--source` to restore from a different commit.
For more information head to git-scm.com/docs/git-restore!
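As a quick sketch of the flows above (using a throwaway repository and a made-up file name, `notes.txt`):

```shell
# Set up a throwaway repo to demo git restore (file name is just an example)
set -e
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name demo

echo "v1" > notes.txt
git add notes.txt
git commit -qm "initial"

echo "staged" > notes.txt
git add notes.txt            # notes.txt now differs in the index
echo "oops" > notes.txt      # ...and differs again in the working tree

git restore notes.txt        # working tree restored from the index
cat notes.txt                # prints "staged"

git restore --staged --worktree notes.txt  # index AND working tree from HEAD
cat notes.txt                # prints "v1"
```

Add `--source <commit>` to restore from somewhere other than the index or `HEAD`.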
Make sure not to miss the Lightning Talk by Oscar Carlsson (IT Syd) on Wednesday at 12:00 (15/12 2021)!
A short description of the presentation:
My name is Oscar Carlsson and I have almost 10 years' experience working with IT security, five of which were spent identifying and exploiting vulnerabilities for companies of a wide range of sizes and sectors.
In this presentation I will perform a short introduction and demonstration of some of the most common vulnerabilities used by hackers on the Internet.
The presentation will be kept as non-technical as possible so that everyone, regardless of prior knowledge, can join in.
Oscar Carlsson and IT Sec, name a more iconic duo at IT Syd.
Oscar introduced me to a really cool tool called SQLMap, which uses heuristics and smart features to automatically pen-test a database.
Really, it's pretty amazing. Make sure to check it out if you're into security!
Did you miss out on Ho Ho Holy Data? Fear not!
Hampus has put together a very rough summary of the event and is happy to discuss it in more detail with anyone interested!
Architecture - bland and gray, according to many. When is the Revolution coming?
Historically, many beautiful buildings have been built; not so much anymore, according to many.
Nathan argues that "We need to build places we can’t stop looking at. It will involve lots of plants." and I think I agree.
The article itself lays down some good points and has beautiful pictures of places you might wish to travel once possible.
If you like architecture and good reading make sure to view this post!
StackOverflow are thrilled to announce a new and foundational feature, Content Health, that helps to intelligently identify and surface potentially outdated or inaccurate knowledge—content that needs to change.
I think it's a great initiative to keep data fresh and something that's dearly needed.
As libraries launch breaking versions it keeps getting harder to find the right functions, parameters or even API calls.
What do you think about the Content Health features?
TLDR: The zero-copy integration between DuckDB and Apache Arrow allows for rapid analysis of larger than memory datasets in Python and R using either SQL or relational APIs.
A few of you have seen DuckDB mentioned previously, and there's a good reason why.
DuckDB is SQLite for columnar data. It's an amazing piece of technology.
For those not aware of (Apache) Arrow: it's an in-memory data format optimized for analytical libraries.
It's blazing fast and can optimize columnar data like crazy. Combining it with Parquet makes things even better!
By combining DuckDB and Arrow you get an amazing result that speaks for itself in terms of performance.
Streaming the data reduces total peak memory enormously (0.3 GB vs 248 GB in their example) and improves speed by orders of magnitude (their benchmarks show 11x-3000x speedups depending on use case and filters).
Make sure to check this new zero-copy integration out!
Thank you for this time, see you in two weeks!
- Hampus Londögård @ AFRY IT South