[92] Evaluation: Submission Scoring Completion #387

Open
wants to merge 35 commits into base: dev

Commits (35)
5fe54a4
90 Updates to eval form rendering and validation
cpreisinger Jan 20, 2025
6a4ab8b
Merge branch 'dev' of github.com:GSA/Challenge_platform into 90/submi…
cpreisinger Jan 27, 2025
7ee0ddc
71 Initial evaluation score calculation
cpreisinger Jan 28, 2025
b6a6811
71 Remove debug statement in eval scores
cpreisinger Jan 28, 2025
bf75c77
71 Fix issue with duplicate criterion scores
cpreisinger Jan 28, 2025
20f9d8f
Merge branch 'dev' of github.com:GSA/Challenge_platform into 71/evalu…
cpreisinger Jan 28, 2025
39de63f
71 Fix codeclimate issues
cpreisinger Jan 29, 2025
2ba5077
71 Change rating and binary to use same rendering
cpreisinger Jan 29, 2025
8809129
Merge branch 'dev' of github.com:GSA/Challenge_platform into 71/evalu…
cpreisinger Jan 31, 2025
328d373
71 PR feedback. Incorrect redirect, typos
cpreisinger Jan 31, 2025
3a86625
71 Fix issue with evaluation total score calc
cpreisinger Feb 2, 2025
d141051
71 Remove debug statement
cpreisinger Feb 3, 2025
e27ca70
92 Initial styling of evaluation form
cpreisinger Feb 5, 2025
60009d4
Merge branch 'dev' of github.com:GSA/Challenge_platform into 92/evalu…
cpreisinger Feb 5, 2025
2524e58
92 Fix spec failures
cpreisinger Feb 5, 2025
e28e569
92 Code climate fixes
cpreisinger Feb 5, 2025
8c68a24
Merge branch 'dev' of github.com:GSA/Challenge_platform into 92/evalu…
cpreisinger Feb 5, 2025
645fe55
Merge branch 'dev' of github.com:GSA/Challenge_platform into 92/evalu…
cpreisinger Feb 6, 2025
2fa5479
92 Auto updating scores and total score for eval
cpreisinger Feb 6, 2025
2aac156
92 Add confirmation page after draft/complete save
cpreisinger Feb 6, 2025
26cca35
92 Fix specs related to confirmation page
cpreisinger Feb 6, 2025
099aff1
92 Refactor eval score stimulus for code climate
cpreisinger Feb 6, 2025
4172dc4
92 Confirmation modal and additional modal func
cpreisinger Feb 7, 2025
03a4b89
Merge branch 'dev' of github.com:GSA/Challenge_platform into 92/evalu…
cpreisinger Feb 7, 2025
6adbbb3
92 Fix complexity of confirm modal function
cpreisinger Feb 7, 2025
e9ffabd
92 Remove func from eval controller for rubocop
cpreisinger Feb 7, 2025
865cba7
92 Add disabled functionality to eval form
cpreisinger Feb 10, 2025
dd5763a
92 Init specs Fix save draft issue Factory updates
cpreisinger Feb 11, 2025
0722b6e
Merge branch 'dev' of github.com:GSA/Challenge_platform into 92/evalu…
cpreisinger Feb 11, 2025
40ed237
92 Fix issue with inline errors and hotdog layout
cpreisinger Feb 11, 2025
b2ea543
92 Reduce evaluation_params complexity
cpreisinger Feb 11, 2025
3d9b461
Merge branch 'dev' of github.com:GSA/Challenge_platform into 92/evalu…
cpreisinger Feb 11, 2025
7e847d1
92 Simplify inline_error function for codeclimate
cpreisinger Feb 11, 2025
59f1705
92 Add happy path spec test/helpers for evaluation
cpreisinger Feb 12, 2025
2a1c34a
92 Move evaluator recusal to a service
cpreisinger Feb 12, 2025
94 changes: 40 additions & 54 deletions app/controllers/evaluations_controller.rb
@@ -1,13 +1,10 @@
# frozen_string_literal: true

# TODO: Reenable rubocop after refactor/shortening controller code or moving some functionality into service
# rubocop:disable Metrics/ClassLength

# Controller for evaluations CRUD actions.
class EvaluationsController < ApplicationController
class EvaluationsController < ApplicationController # rubocop:disable Metrics/ClassLength
before_action -> { authorize_user('evaluator') }
before_action :set_evaluation_and_submission_assignment, only: %i[create update]
before_action :set_phase, only: [:submissions]

def index
@phases = Phase.joins(:evaluator_submission_assignments).
@@ -20,6 +17,12 @@ def index
end

def submissions
@phase = Phase.joins(:challenge_phases_evaluators).
where(challenge_phases_evaluators: { user_id: current_user.id }).
find(params[:id])

@challenge = @phase.challenge

@assigned_submissions = @phase.evaluator_submission_assignments.
where(evaluator: current_user).
where(status: %i[assigned recused]).
@@ -29,6 +32,11 @@
@submissions_count = helpers.calculate_submissions_count(@assigned_submissions)
end

def confirmation
@evaluation = Evaluation.find(params[:id])
@subaction = params[:subaction]
end

def new
fetch_evaluator_submission_assignment

@@ -64,7 +72,7 @@ def create
I18n.t("evaluations.notices.saved_draft")
end

redirect_to submissions_evaluation_path(@evaluation.submission.phase_id)
redirect_to confirmation_evaluation_path(@evaluation, subaction: params[:subaction])
else
render :show, status: :unprocessable_entity
end
@@ -79,18 +87,22 @@ def update
I18n.t("evaluations.notices.saved_draft")
end

redirect_to submissions_evaluation_path(@evaluation.submission.phase_id)
redirect_to confirmation_evaluation_path(@evaluation, subaction: params[:subaction])
else
render :show, status: :unprocessable_entity
end
end

def recuse
@evaluation = Evaluation.find_by(id: params[:id], user_id: current_user.id)
fetch_evaluator_submission_assignment
return unauthorized_redirect unless can_access_evaluation?
@evaluator_submission_assignment =
current_user.evaluator_submission_assignments.where(submission_id: params[:submission_id]).first

process_recusal
if EvaluatorRecusalService.new(@evaluator_submission_assignment).call
flash[:notice] = I18n.t("evaluations.recusal.success")
redirect_to submissions_evaluation_path(@evaluator_submission_assignment.phase), status: :see_other
else
unauthorized_redirect
end
end

private
@@ -119,13 +131,6 @@ def set_evaluation_and_submission_assignment
unauthorized_redirect unless can_access_evaluation?
end

def set_phase
@phase = Phase.joins(:challenge_phases_evaluators).
where(challenge_phases_evaluators: { user_id: current_user.id }).
find(params[:id])
@challenge = @phase.challenge
end

def find_or_initialize_evaluation
if params[:id]
Evaluation.includes([evaluation_scores: :evaluation_criterion]).find(params[:id])
@@ -159,51 +164,32 @@ def build_evaluation
end
end

def recuse_evaluator
@evaluator_submission_assignment&.update(status: :recused)
end

def destroy_recused_evaluation
@evaluator_submission_assignment.evaluation&.destroy!
end

# Redirect Helpers
def unauthorized_redirect
redirect_to evaluations_path, alert: I18n.t("evaluations.alerts.unauthorized")
end

def evaluation_params
params.require(:evaluation).permit(
:user_id,
:evaluator_submission_assignment_id,
:submission_id,
:evaluation_form_id,
:additional_comments,
:revision_comments,
evaluation_scores_attributes: %i[
id evaluation_criterion_id
score score_override
comment comment_override
]
)
end
normalize_evaluation_scores_keys!

def process_recusal
if recuse_evaluator
destroy_recused_evaluation
flash[:notice] = I18n.t("evaluations.recusal.success")
redirect_to submissions_evaluation_path(@evaluator_submission_assignment.phase), status: :see_other
else
handle_recusal_failure
end
rescue ActiveRecord::RecordInvalid
handle_recusal_failure
permitted_attributes = if @evaluation&.completed_at.present?
%i[revision_comments] + [{ evaluation_scores_attributes: %i[id score_override
comment_override] }]
else
%i[user_id evaluator_submission_assignment_id submission_id evaluation_form_id
additional_comments revision_comments] +
[{ evaluation_scores_attributes: %i[id evaluation_criterion_id score score_override
comment comment_override] }]
end

params.require(:evaluation).permit(*permitted_attributes)
end

def handle_recusal_failure
flash[:alert] = I18n.t("evaluations.recusal.failure")
redirect_to submissions_evaluation_path(@evaluator_submission_assignment.phase), status: :see_other
# Normalize random hex keys to integer indexes rails understands for nested_attributes
def normalize_evaluation_scores_keys!
return if params.dig(:evaluation, :evaluation_scores_attributes).blank?

params[:evaluation][:evaluation_scores_attributes] =
params[:evaluation][:evaluation_scores_attributes].transform_keys.with_index { |_key, index| index.to_s }
end
end
# TODO: Remove this after above refactor
# rubocop:enable Metrics/ClassLength
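
A quick illustration of what the new `normalize_evaluation_scores_keys!` step does to the incoming params. This is a minimal sketch with hypothetical hex identifiers and param values, not code from this PR:

```ruby
# Hypothetical raw params: the score fields are rendered under random hex
# identifiers (see the controller comment above), so the nested attributes
# arrive keyed by those identifiers rather than by position.
raw_scores = {
  "3f9c2a" => { "evaluation_criterion_id" => "1", "score" => "4" },
  "b71e0d" => { "evaluation_criterion_id" => "2", "score" => "2" }
}

# transform_keys with no block returns an enumerator; with_index then replaces
# each hex key with its positional index as a string, which is the shape Rails
# nested-attributes assignment expects.
normalized = raw_scores.transform_keys.with_index { |_key, index| index.to_s }
# => { "0" => { "evaluation_criterion_id" => "1", "score" => "4" },
#      "1" => { "evaluation_criterion_id" => "2", "score" => "2" } }
```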
74 changes: 74 additions & 0 deletions app/helpers/evaluation_scores_helper.rb
@@ -0,0 +1,74 @@
# frozen_string_literal: true

# View helpers for evaluation score form inputs
module EvaluationScoresHelper
def evaluation_score_id(_form, attribute, identifier)
prefix = "evaluation_evaluation_scores_attributes"

"#{prefix}_#{identifier}_#{attribute}"
end

def evaluation_score_name(_form, attribute, identifier)
prefix = "evaluation[evaluation_scores_attributes]"

"#{prefix}[#{identifier}][#{attribute}]"
end

def evaluation_score_input(score_fields, criterion, identifier, disabled)
field_id = evaluation_score_id(score_fields, :score, identifier)
field_name = evaluation_score_name(score_fields, :score, identifier)

case criterion.scoring_type
when 'numeric'
score_numeric_input(score_fields, field_id, field_name, disabled)
else # rating or binary
content_tag(:div, class: "usa-fieldset") do
score_options(criterion).each do |value, label|
concat(score_radio_input(score_fields, value, label, id: field_id, name: field_name, disabled:))
end
end
end
end

def score_numeric_input(score_fields, id, name, disabled)
criterion = score_fields.object.evaluation_criterion
min = 0
max = criterion.points_or_weight

content_tag(:div, class: "display-flex flex-column") do
# TODO: Should the lowest be 1?
concat(score_fields.label(:score, "Enter a number between #{min} and #{max}", for: id))
concat(score_fields.number_field(
:score, id:, name:, min:, max:, class: "usa-input width-10",
data: {
'evaluation-score-target': "scoreInput",
action: "input->evaluation-score#calculateScore"
}, disabled:
))
end
end

def score_options(criterion)
(criterion.option_range_start..criterion.option_range_end).map do |value|
[value, criterion.option_labels[value.to_s] || value]
end
end

def score_radio_input(score_fields, value, label, opts = {})
id = opts[:id]
name = opts[:name]
disabled = opts[:disabled]

content_tag(:div, class: "usa-radio") do
concat(score_fields.radio_button(
:score, value, id: "#{id}_#{value}", name:, class: "usa-radio__input usa-radio__input--tile",
data: {
'evaluation-score-target': "scoreInput",
action: "change->evaluation-score#calculateScore"
},
disabled:
))
concat(score_fields.label("score_#{value}", label, for: "#{id}_#{value}", class: "usa-radio__label"))
end
end
end
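
For reference, a minimal sketch (with a hypothetical identifier) of the id/name strings these helpers generate, which is what lets each criterion's inputs be keyed by a custom identifier instead of a numeric index:

```ruby
# Hypothetical identifier; the _form argument is unused by these helpers,
# so nil is enough to demonstrate the output.
identifier = "3f9c2a"

evaluation_score_id(nil, :score, identifier)
# => "evaluation_evaluation_scores_attributes_3f9c2a_score"

evaluation_score_name(nil, :score, identifier)
# => "evaluation[evaluation_scores_attributes][3f9c2a][score]"
```

These identifier-keyed names are what the controller's normalize_evaluation_scores_keys! later maps back to positional indexes before assignment.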
27 changes: 2 additions & 25 deletions app/helpers/evaluations_helper.rb
@@ -1,6 +1,5 @@
# frozen_string_literal: true

# rubocop:disable Metrics/ModuleLength
# View helpers for calculating evaluation & submission details.
module EvaluationsHelper
STATUS_COLORS = {
@@ -123,29 +122,7 @@ def evaluation_link(assignment)
link_to("Evaluate", link_path, class: "usa-button font-body-2xs width-full text-no-wrap")
end

def evaluation_score_input(score_fields, criterion)
content_tag(:div) do
case criterion.scoring_type
when 'numeric'
score_fields.number_field(:score, min: 0, max: criterion.points_or_weight)
# When rating or binary. Maybe change later if input styles are different
else
score_options(criterion).each { |value, label| concat(score_radio_input(score_fields, value, label)) }
end
end
end

def score_options(criterion)
(criterion.option_range_start..criterion.option_range_end).map do |value|
[value, criterion.option_labels[value.to_s] || value]
end
end

def score_radio_input(score_fields, value, label)
content_tag(:div) do
concat(score_fields.radio_button(:score, value))
concat(score_fields.label("score_#{value}", label))
end
def form_disabled?(evaluation)
evaluation.completed_at
end
end
# rubocop:enable Metrics/ModuleLength
25 changes: 21 additions & 4 deletions app/helpers/form_helper.rb
@@ -12,11 +12,28 @@ def input_error_class(form, fields)
Array(fields).any? { |field| object.errors[field].present? } ? "border-secondary" : ""
end

def inline_error(form, field)
object = form.object
field_id = (form.object_name + "_#{field}_error").gsub(/[\[\]]/, "_").squeeze('_')
error = object.errors[field].present? ? object.errors[field].join(", ") : ""
def inline_error(form, field, identifier = nil)
object_name = formatted_object_name(form, identifier)
field_id = "#{normalize_field_name(object_name, field)}_error"
error = form.object.errors[field].presence&.join(", ") || ""

tag.span(error, class: "text-secondary font-body-2xs", id: field_id)
end

private

def formatted_object_name(form, identifier)
object_name = form.object_name
return object_name if identifier.blank?

if form.options[:child_index].present?
object_name.sub(/\[\d+\]$/, "[#{identifier}]")
else
"#{object_name}[#{identifier}]"
end
end

def normalize_field_name(object_name, field)
"#{object_name}_#{field}".gsub(/[\[\]]/, "_").squeeze("_")
end
end
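
One way to see how the new identifier handling keeps error spans aligned with their inputs. A minimal sketch with a hypothetical object name and identifier, mirroring formatted_object_name and normalize_field_name above:

```ruby
object_name = "evaluation[evaluation_scores_attributes][0]" # Rails-generated, child_index present
identifier  = "3f9c2a"                                      # hypothetical hex identifier

# formatted_object_name swaps the trailing numeric index for the identifier.
with_identifier = object_name.sub(/\[\d+\]$/, "[#{identifier}]")
# => "evaluation[evaluation_scores_attributes][3f9c2a]"

# normalize_field_name flattens brackets to underscores; inline_error appends "_error".
flattened = "#{with_identifier}_score".gsub(/[\[\]]/, "_").squeeze("_")
field_id  = "#{flattened}_error"
# => "evaluation_evaluation_scores_attributes_3f9c2a_score_error"
```

The resulting id matches the input id produced by evaluation_score_id, so each error span sits next to the field it describes.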
21 changes: 21 additions & 0 deletions app/javascript/controllers/evaluation_controller.js
@@ -0,0 +1,21 @@
import { Controller } from "@hotwired/stimulus";

export default class extends Controller {
static targets = ["calculatedScore", "totalScore"];

connect() {
this.updateTotalScore();
}

updateTotalScore() {
let totalScore = 0;

this.calculatedScoreTargets.forEach((span) => {
totalScore += parseFloat(span.textContent) || 0;
});

if (this.hasTotalScoreTarget) {
this.totalScoreTarget.textContent = totalScore.toFixed(2);
}
}
}
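
A sketch of the markup this Stimulus controller expects, expressed with Rails tag helpers rather than copied from the PR's views. The target names come from the controller above; the element structure and values are assumptions for illustration:

```ruby
# Assumed wiring: a wrapper registers the "evaluation" controller, each
# criterion's computed score renders into a calculatedScore target, and
# connect() writes their sum into the totalScore target.
tag.div(data: { controller: "evaluation" }) do
  safe_join([
    tag.span("4.00", data: { "evaluation-target": "calculatedScore" }),
    tag.span("2.50", data: { "evaluation-target": "calculatedScore" }),
    tag.span("0.00", data: { "evaluation-target": "totalScore" }) # becomes "6.50" after connect()
  ])
end
```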