feature: run a clean-up job when uninstalling #3339
Conversation
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@           Coverage Diff           @@
##             main    #3339   +/-   ##
=======================================
  Coverage   81.31%   81.31%
=======================================
  Files          50       50
  Lines        4352     4352
=======================================
  Hits         3539     3539
  Misses        661      661
  Partials      152      152

Flags with carried forward coverage won't be shown.
Could it be tested in an E2E case?
err = c.Delete(ctx, &item)
if err != nil {
    return err
The cleanup job should not be interrupted; it should clean up as much as possible.
It should look like this:
    }
}

return nil
If any cleanup in the middle fails, return an error at the end so that the failure is reflected in the job's status.
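A minimal sketch of that pattern, with hypothetical names (the actual function in this PR differs): delete every item, log and record each failure, and return one aggregated error at the end so the cleanup Job's status shows the failure.

```go
// Hypothetical sketch only: loop over the items, never stop early, and
// surface a combined error at the end so the Job exits non-zero.
package cleanup

import (
	"context"
	"errors"
	"log"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

func cleanupItems(ctx context.Context, c client.Client, items []client.Object) error {
	var errs []error
	for _, item := range items {
		err := c.Delete(ctx, item)
		if err != nil && !apierrors.IsNotFound(err) {
			// The real code logs through zap's sugared logger; stdlib log keeps
			// this sketch self-contained.
			log.Printf("failed to delete %s: %v", item.GetName(), err)
			errs = append(errs, err)
			continue // keep cleaning the remaining items
		}
	}
	// A non-nil return makes the cleanup Job fail, so the problem is visible
	// in the Job's status.
	return errors.Join(errs...)
}
```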
if err != nil {
    return fmt.Errorf("failed to update spiderEndpoint's: %v finalizers to be empty: %v ", &item, err)
}
logger.Sugar().Infof("succeeded to update spiderEndpoint's: %v finalizers to be empty", &item)
Suggested log wording: "succeeded to clean the finalizers of spiderEndpoint xxxx".
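For illustration, a hedged sketch of how that step could read with the suggested wording; the helper name and the generic client.Object parameter are assumptions, not the code in this PR.

```go
// Hypothetical helper: clear the finalizers of a SpiderEndpoint and log with
// the wording suggested above.
package cleanup

import (
	"context"
	"fmt"

	"go.uber.org/zap"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

func cleanEndpointFinalizers(ctx context.Context, c client.Client, logger *zap.Logger, ep client.Object) error {
	// Drop all finalizers so the object can be garbage-collected.
	ep.SetFinalizers(nil)
	if err := c.Update(ctx, ep); err != nil {
		return fmt.Errorf("failed to clean the finalizers of spiderEndpoint %s/%s: %w", ep.GetNamespace(), ep.GetName(), err)
	}
	logger.Sugar().Infof("succeeded to clean the finalizers of spiderEndpoint %s/%s", ep.GetNamespace(), ep.GetName())
	return nil
}
```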
@@ -757,6 +757,10 @@ spiderpoolController:
## @param spiderpoolController.tls.auto.extraDnsNames extra DNS names of server cert for auto method
extraDnsNames: []

cleanup:
Mention it in the uninstall documentation.
if err == nil {
    err := c.Delete(ctx, mObj)
    if err != nil {
        return fmt.Errorf("failed to delete MutatingWebhookConfiguration: %v.", err)
Uninstallation should delete as much as possible. If any deletion fails, do not return; log it and continue deleting the remaining resources.
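As a sketch of that comment applied to the quoted snippet (hypothetical helper name; the webhook name is a placeholder): log the failure and let the caller move on instead of returning early.

```go
// Hypothetical sketch: best-effort deletion of a MutatingWebhookConfiguration
// that logs failures instead of returning them to the caller.
package cleanup

import (
	"context"
	"log"

	admissionregistrationv1 "k8s.io/api/admissionregistration/v1"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

func deleteMutatingWebhook(ctx context.Context, c client.Client, name string) {
	mObj := &admissionregistrationv1.MutatingWebhookConfiguration{
		ObjectMeta: metav1.ObjectMeta{Name: name},
	}
	if err := c.Delete(ctx, mObj); err != nil && !apierrors.IsNotFound(err) {
		// Do not return the error: record it and let the caller continue
		// deleting the remaining resources.
		log.Printf("failed to delete MutatingWebhookConfiguration %s: %v", name, err)
	}
}
```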
@cyclinder For the current cleanup of leftover resources, besides Spiderpool's own, do we also need to help delete leftover resources from other projects?
The headings could be shorter and better.
docs/usage/install/uninstall.md (outdated)

Replace `<spiderpool-name>` with the name of the Spiderpool you want to uninstall and `<spiderpool-namespace>` with the namespace of the Spiderpool.

### In versions higher than 0.9.2
Above v0.9.2
docs/usage/install/uninstall.md (outdated)

```
kubectl get spidercoordinators.spiderpool.spidernet.io -o name | wc -l
```

### In versions lower than 0.9.2
Suggested change:
- ### In versions lower than 0.9.2
+ ### Below v0.9.2
将 `<spiderpool-name>` 替换为要卸载的 Spiderpool 的名称，将 `<spiderpool-namespace>` 替换为 Spiderpool 所在的命名空间。

### 在高于 0.9.2 的版本
Suggested change:
- ### 在高于 0.9.2 的版本
+ ### 0.9.2 以上版本
```
kubectl get spidercoordinators.spiderpool.spidernet.io -o name | wc -l
```

### 在低于 0.9.2 的版本
Suggested change:
- ### 在低于 0.9.2 的版本
+ ### 0.9.2 以下版本
Can you update your PR description to list all the resources you clean? I think we should clean up the resources below, in order (a rough sketch follows this list):
- All the CRD instances: IPPools, Subnets, SpiderMultusConfigs, ...
- The CRD definitions.
- Pods that have not been completely deleted, for example sriov-network-config-daemon; see clean sriov pod #2822
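A rough sketch of that order, under stated assumptions: the GroupVersionKinds, the CRD name, and the continue-on-error handling below are illustrative placeholders, not the code in this PR.

```go
// Hypothetical sketch of the suggested order: CR instances first, then the CRD
// definitions, then any leftover pods (all best-effort).
package cleanup

import (
	"context"
	"log"

	apiextensionsv1 "k8s.io/apiextensions-apiserver/pkg/apis/apiextensions/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

func cleanupInOrder(ctx context.Context, c client.Client) {
	// 1. Delete all instances of the cluster-scoped Spiderpool CRs
	//    (namespaced kinds would additionally need client.InNamespace).
	for _, kind := range []string{"SpiderIPPool", "SpiderSubnet"} {
		u := &unstructured.Unstructured{}
		u.SetGroupVersionKind(schema.GroupVersionKind{
			Group: "spiderpool.spidernet.io", Version: "v2beta1", Kind: kind,
		})
		if err := c.DeleteAllOf(ctx, u); err != nil {
			log.Printf("failed to delete %s instances: %v", kind, err) // log and continue
		}
	}

	// 2. Delete the CRD definitions themselves (placeholder name).
	crd := &apiextensionsv1.CustomResourceDefinition{
		ObjectMeta: metav1.ObjectMeta{Name: "spiderippools.spiderpool.spidernet.io"},
	}
	if err := c.Delete(ctx, crd); err != nil {
		log.Printf("failed to delete CRD %s: %v", crd.Name, err)
	}

	// 3. Leftover pods such as sriov-network-config-daemon (#2822) would be
	//    handled here, e.g. via a label selector, once it is clear whether
	//    they were installed by Spiderpool.
}
```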
Updated. Step 2 is easy to do. But for step 3, what if the customer installed it themselves? Is there any way we can know whether a resource like sriov was installed via Spiderpool?
### 0.9.2 以上版本

在 0.9.2 之后引入了自动清理 Spiderpool 资源的功能，它通过 `spiderpoolController.cleanup.enabled` 配置项来启用，该值默认为 `true`，您可以通过如下方式验证与 Spiderpool 相关的资源数量是否自动被清理。
Why 0.9.2, rather than the next release such as 0.10?
This will not go into 0.9.3 either.
done
Signed-off-by: tao.yang <[email protected]>
Thanks for contributing!
What type of PR is this?
What this PR does / why we need it:
close #2653
Which issue(s) this PR fixes:
Fixes #
Special notes for your reviewer: