fix: wrong resolution count in timeseries reports (#12261)

There was a fundamental difference in how resolution counts were
calculated between the agent summary and timeseries reports, causing
confusion for users when the numbers didn't match.

The agent summary report counted all `conversation_resolved` events
within a time period by querying the `reporting_events` table directly.
However, the timeseries report had an additional constraint that
required the conversation to currently be in resolved status
(`conversations.status = 1`). This meant that if an agent resolved a
conversation that was later reopened, the resolution action would be
counted in the summary but not in the timeseries.
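
To make the mismatch concrete, here is a rough sketch of the two counting styles (simplified from the scopes in the diff below; `account` and `conversation` stand in for whatever scope the builder runs against):

```ruby
# Simplified sketch; the real scopes live in ReportHelper and CountReportBuilder (see diff below).
conversation.resolved!  # emits a `conversation_resolved` reporting event
conversation.open!      # reopened: the status changes, but the event row remains

range = 1.week.ago..Time.current

# Summary-style count: every resolution event in the range => 1
account.reporting_events.where(name: :conversation_resolved, created_at: range).count

# Old timeseries-style count: only events whose conversation is STILL resolved => 0
account.reporting_events
       .joins(:conversation)
       .where(name: :conversation_resolved,
              conversations: { status: :resolved },
              created_at: range)
       .distinct
       .count
```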

This fix aligns both reports to count resolution events rather than
conversations in resolved state. When an agent resolves a conversation,
they should receive credit for that action regardless of what happens to
the conversation afterward. The same logic now applies to bot
resolutions as well.

The change removes the `conversations: { status: :resolved }` condition
from both `scope_for_resolutions_count` and
`scope_for_bot_resolutions_count` methods in CountReportBuilder, and
updates the corresponding test expectations to reflect that all
resolution events are counted.


## About timezone handling

When a timezone is specified via the `timezone_offset` parameter, the
reporting system:

1. Converts timestamps to the target timezone before grouping
2. Groups data by local day/week/month boundaries in that timezone (the
overall `since`/`until` boundaries are sent by the frontend and used as-is)
3. Returns timestamps representing midnight in the target timezone

This means the same events can appear in different day buckets depending
on the timezone used. Summary reports are unaffected: the user only
needs the total count between two timestamps, and the frontend already
sends those timestamps adjusted for the timezone.
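
As a rough illustration of the bucket shift (plain ActiveSupport time math; the actual grouping is done by `group_by_period` in the timeseries builder shown in the diff below):

```ruby
# An event created shortly after midnight UTC on Jan 15...
event_time = Time.utc(2024, 1, 15, 0, 30)

# ...falls on Jan 15 when grouped by day in UTC,
event_time.in_time_zone('UTC').to_date                  # => 2024-01-15

# ...but on Jan 14 when grouped by day in PST (UTC-8).
event_time.in_time_zone('America/Los_Angeles').to_date  # => 2024-01-14

# This is why timeseries buckets shift with `timezone_offset`, while the
# summary total for the same since/until window stays the same.
```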

## Testing Locally

Run the following command. It will erase all data for that account and
seed 1,000 conversations over the last 3 months; the parameters can be
tweaked in `Seeders::Reports::ReportDataSeeder`.

I'd suggest updating the values to generate data over 30 days with
10,000 conversations. It will take its sweet time to run, but the
resulting data will be really rich, which is great for testing.

```
ACCOUNT_ID=2 ENABLE_ACCOUNT_SEEDING=true bundle exec rake db:seed:reports_data
```

Pro tip: don't run the app while the seeder is active; we create the
reporting events manually anyway. Once it's done, run `redis-cli FLUSHALL`
to clear the queued Sidekiq jobs. It will be easier on the system.

Use the following scripts to test it:

- https://gist.github.com/scmmishra/1263a922f5efd24df8e448a816a06257
- https://gist.github.com/scmmishra/ca0b861fa0139e2cccdb72526ea844b2
- https://gist.github.com/scmmishra/5fe73d1f48f35422fd1fd142ea3498f3
- https://gist.github.com/scmmishra/3b7b1f9e2ff149007170e5c329432f45
- https://gist.github.com/scmmishra/f245fa2f44cd973e5d60aac64f979162

---------

Co-authored-by: Sivin Varghese <64252451+iamsivin@users.noreply.github.com>
Co-authored-by: Pranav <pranav@chatwoot.com>
Co-authored-by: Muhsin Keloth <muhsinkeramam@gmail.com>
Authored by Shivam Mishra on 2025-09-03 15:47:16 +05:30, committed via GitHub.
Commit ef4e287f0d (parent 2320782f38), 6 changed files with 353 additions and 25 deletions.


@@ -28,7 +28,7 @@ class V2::Reports::LabelSummaryBuilder < V2::Reports::BaseSummaryBuilder
     {
       conversation_counts: fetch_conversation_counts(conversation_filter),
-      resolved_counts: fetch_resolved_counts(conversation_filter),
+      resolved_counts: fetch_resolved_counts,
       resolution_metrics: fetch_metrics(conversation_filter, 'conversation_resolved', use_business_hours),
       first_response_metrics: fetch_metrics(conversation_filter, 'first_response', use_business_hours),
       reply_metrics: fetch_metrics(conversation_filter, 'reply_time', use_business_hours)
@@ -62,10 +62,21 @@ class V2::Reports::LabelSummaryBuilder < V2::Reports::BaseSummaryBuilder
     fetch_counts(conversation_filter)
   end

-  def fetch_resolved_counts(conversation_filter)
-    # since the base query is ActsAsTaggableOn,
-    # the status :resolved won't automatically be converted to integer status
-    fetch_counts(conversation_filter.merge(status: Conversation.statuses[:resolved]))
+  def fetch_resolved_counts
+    # Count resolution events, not conversations currently in resolved status
+    # Filter by reporting_event.created_at, not conversation.created_at
+    reporting_event_filter = { name: 'conversation_resolved', account_id: account.id }
+    reporting_event_filter[:created_at] = range if range.present?
+
+    ReportingEvent
+      .joins(conversation: { taggings: :tag })
+      .where(
+        reporting_event_filter.merge(
+          taggings: { taggable_type: 'Conversation', context: 'labels' }
+        )
+      )
+      .group('tags.name')
+      .count
   end

   def fetch_counts(conversation_filter)
@@ -84,9 +95,7 @@ class V2::Reports::LabelSummaryBuilder < V2::Reports::BaseSummaryBuilder
   def fetch_metrics(conversation_filter, event_name, use_business_hours)
     ReportingEvent
-      .joins('INNER JOIN conversations ON reporting_events.conversation_id = conversations.id')
-      .joins('INNER JOIN taggings ON taggings.taggable_id = conversations.id')
-      .joins('INNER JOIN tags ON taggings.tag_id = tags.id')
+      .joins(conversation: { taggings: :tag })
       .where(
         conversations: conversation_filter,
         name: event_name,


@@ -38,17 +38,17 @@ class V2::Reports::Timeseries::CountReportBuilder < V2::Reports::Timeseries::Bas
   end

   def scope_for_resolutions_count
-    scope.reporting_events.joins(:conversation).select(:conversation_id).where(
+    scope.reporting_events.where(
       name: :conversation_resolved,
-      conversations: { status: :resolved }, created_at: range
-    ).distinct
+      created_at: range
+    )
   end

   def scope_for_bot_resolutions_count
-    scope.reporting_events.joins(:conversation).select(:conversation_id).where(
+    scope.reporting_events.where(
       name: :conversation_bot_resolved,
-      conversations: { status: :resolved }, created_at: range
-    ).distinct
+      created_at: range
+    )
   end

   def scope_for_bot_handoffs_count
@@ -59,6 +59,10 @@ class V2::Reports::Timeseries::CountReportBuilder < V2::Reports::Timeseries::Bas
   end

   def grouped_count
+    # IMPORTANT: time_zone parameter affects both data grouping AND output timestamps
+    # It converts timestamps to the target timezone before grouping, which means
+    # the same event can fall into different day buckets depending on timezone
+    # Example: 2024-01-15 00:00 UTC becomes 2024-01-14 16:00 PST (falls on different day)
     @grouped_values = object_scope.group_by_period(
       group_by,
       :created_at,


@@ -53,13 +53,13 @@ module ReportHelper
   end

   def resolutions
-    scope.reporting_events.joins(:conversation).select(:conversation_id).where(account_id: account.id, name: :conversation_resolved,
-                                                                                conversations: { status: :resolved }, created_at: range).distinct
+    scope.reporting_events.where(account_id: account.id, name: :conversation_resolved,
+                                 created_at: range)
   end

   def bot_resolutions
-    scope.reporting_events.joins(:conversation).select(:conversation_id).where(account_id: account.id, name: :conversation_bot_resolved,
-                                                                                conversations: { status: :resolved }, created_at: range).distinct
+    scope.reporting_events.where(account_id: account.id, name: :conversation_bot_resolved,
+                                 created_at: range)
   end

   def bot_handoffs


@@ -120,8 +120,38 @@ describe V2::ReportBuilder do
builder = described_class.new(account, params)
metrics = builder.timeseries
- # 4 conversations are resolved
- expect(metrics[Time.zone.today]).to be 4
+ # 5 resolution events occurred (even though 1 was later reopened)
+ expect(metrics[Time.zone.today]).to be 5
expect(metrics[Time.zone.today - 2.days]).to be 0
end
end
it 'return resolutions count with multiple resolutions of same conversation' do
travel_to(Time.zone.today) do
params = {
metric: 'resolutions_count',
type: :account,
since: (Time.zone.today - 3.days).to_time.to_i.to_s,
until: Time.zone.today.end_of_day.to_time.to_i.to_s
}
conversations = account.conversations.where('created_at < ?', 1.day.ago)
perform_enqueued_jobs do
# Resolve all 5 conversations (first round)
conversations.each(&:resolved!)
# Reopen 2 conversations and resolve them again
conversations.first(2).each do |conversation|
conversation.open!
conversation.resolved!
end
end
builder = described_class.new(account, params)
metrics = builder.timeseries
# 7 total resolution events: 5 initial + 2 re-resolutions
expect(metrics[Time.zone.today]).to be 7
expect(metrics[Time.zone.today - 2.days]).to be 0
end
end
@@ -153,10 +183,10 @@ describe V2::ReportBuilder do
metrics = builder.timeseries
summary = builder.bot_summary
- # 4 conversations are resolved
- expect(metrics[Time.zone.today]).to be 4
+ # 5 bot resolution events occurred (even though 1 was later reopened)
+ expect(metrics[Time.zone.today]).to be 5
expect(metrics[Time.zone.today - 2.days]).to be 0
- expect(summary[:bot_resolutions_count]).to be 4
+ expect(summary[:bot_resolutions_count]).to be 5
end
end
@@ -339,8 +369,40 @@ describe V2::ReportBuilder do
builder = described_class.new(account, params)
metrics = builder.timeseries
- # this should count only 4 since the last conversation was reopened
- expect(metrics[Time.zone.today]).to be 4
+ # this should count all 5 resolution events (even though 1 was later reopened)
+ expect(metrics[Time.zone.today]).to be 5
expect(metrics[Time.zone.today - 2.days]).to be 0
end
end
it 'return resolutions count with multiple resolutions of same conversation' do
travel_to(Time.zone.today) do
params = {
metric: 'resolutions_count',
type: :label,
id: label_2.id,
since: (Time.zone.today - 3.days).to_time.to_i.to_s,
until: (Time.zone.today + 1.day).to_time.to_i.to_s
}
conversations = account.conversations.where('created_at < ?', 1.day.ago)
perform_enqueued_jobs do
# Resolve all 5 conversations (first round)
conversations.each(&:resolved!)
# Reopen 3 conversations and resolve them again
conversations.first(3).each do |conversation|
conversation.open!
conversation.resolved!
end
end
builder = described_class.new(account, params)
metrics = builder.timeseries
# 8 total resolution events: 5 initial + 3 re-resolutions
expect(metrics[Time.zone.today]).to be 8
expect(metrics[Time.zone.today - 2.days]).to be 0
end
end


@@ -313,5 +313,61 @@ RSpec.describe V2::Reports::LabelSummaryBuilder do
expect(label_1_report[:avg_first_response_time]).to eq(1800.0)
end
end
context 'with resolution count with multiple resolutions of same conversation' do
let(:business_hours) { false }
let(:account2) { create(:account) }
let(:unique_label_name) { SecureRandom.uuid }
let(:test_label) { create(:label, title: unique_label_name, account: account2) }
let(:test_date) { Date.new(2025, 6, 15) }
let(:account2_builder) do
described_class.new(account: account2, params: {
business_hours: false,
since: test_date.to_time.to_i.to_s,
until: test_date.end_of_day.to_time.to_i.to_s,
timezone_offset: 0
})
end
before do
# Ensure test_label is created
test_label
travel_to(test_date) do
user = create(:user, account: account2)
inbox = create(:inbox, account: account2)
create(:inbox_member, user: user, inbox: inbox)
gravatar_url = 'https://www.gravatar.com'
stub_request(:get, /#{gravatar_url}.*/).to_return(status: 404)
perform_enqueued_jobs do
conversation = create(:conversation, account: account2,
inbox: inbox, assignee: user,
created_at: test_date)
conversation.update_labels(unique_label_name)
conversation.label_list
conversation.save!
# First resolution
conversation.resolved!
# Reopen conversation
conversation.open!
# Second resolution
conversation.resolved!
end
end
end
it 'counts multiple resolution events for same conversation' do
report = account2_builder.build
test_label_report = report.find { |r| r[:name] == unique_label_name }
expect(test_label_report).not_to be_nil
expect(test_label_report[:resolved_conversations_count]).to eq(2)
end
end
end
end


@@ -0,0 +1,197 @@
require 'rails_helper'
RSpec.describe Api::V2::Accounts::ReportsController, type: :request do
let(:account) { create(:account) }
let(:admin) { create(:user, account: account, role: :administrator) }
let(:agent) { create(:user, account: account, role: :agent) }
let(:inbox) { create(:inbox, account: account) }
describe 'GET /api/v2/accounts/{account.id}/reports' do
context 'when authenticated and authorized' do
before do
# Create conversations across 24 hours at different times
base_time = Time.utc(2024, 1, 15, 0, 0) # Start at midnight UTC
# Create conversations every 4 hours across 24 hours
6.times do |i|
time = base_time + (i * 4).hours
travel_to time do
conversation = create(:conversation, account: account, inbox: inbox, assignee: agent)
create(:message, account: account, conversation: conversation, message_type: :outgoing)
end
end
end
it 'timezone_offset affects data grouping and timestamps correctly' do
Time.use_zone('UTC') do
base_time = Time.utc(2024, 1, 15, 0, 0)
base_params = {
metric: 'conversations_count',
type: 'account',
since: (base_time - 1.day).to_i.to_s,
until: (base_time + 2.days).to_i.to_s,
group_by: 'day'
}
responses = [0, -8, 9].map do |offset|
get "/api/v2/accounts/#{account.id}/reports",
params: base_params.merge(timezone_offset: offset),
headers: admin.create_new_auth_token, as: :json
response.parsed_body
end
data_entries = responses.map { |r| r.select { |e| e['value'] > 0 } }
totals = responses.map { |r| r.sum { |e| e['value'] } }
timestamps = responses.map { |r| r.map { |e| e['timestamp'] } }
# Data conservation and redistribution
expect(totals.uniq).to eq([6])
expect(data_entries[0].map { |e| e['value'] }).to eq([1, 5])
expect(data_entries[1].map { |e| e['value'] }).to eq([3, 3])
expect(data_entries[2].map { |e| e['value'] }).to eq([4, 2])
# Timestamp differences
expect(timestamps.uniq.size).to eq(3)
timestamps[0].zip(timestamps[1]).each { |utc, pst| expect(utc - pst).to eq(-28_800) }
end
end
describe 'timezone_offset does not affect summary report totals' do
let(:base_time) { Time.utc(2024, 1, 15, 12, 0) }
let(:summary_params) do
{
type: 'account',
since: (base_time - 1.day).to_i.to_s,
until: (base_time + 1.day).to_i.to_s
}
end
let(:jst_params) do
# For JST: User wants "Jan 15 JST" which translates to:
# Jan 14 15:00 UTC to Jan 15 15:00 UTC (event NOT included)
{
type: 'account',
since: (Time.utc(2024, 1, 15, 0, 0) - 9.hours).to_i.to_s, # Jan 14 15:00 UTC
until: (Time.utc(2024, 1, 16, 0, 0) - 9.hours).to_i.to_s # Jan 15 15:00 UTC
}
end
let(:utc_params) do
# For UTC: Jan 15 00:00 UTC to Jan 16 00:00 UTC (event included)
{
type: 'account',
since: Time.utc(2024, 1, 15, 0, 0).to_i.to_s,
until: Time.utc(2024, 1, 16, 0, 0).to_i.to_s
}
end
it 'returns identical conversation counts across timezones' do
Time.use_zone('UTC') do
summaries = [-8, 0, 9].map do |offset|
get "/api/v2/accounts/#{account.id}/reports/summary",
params: summary_params.merge(timezone_offset: offset),
headers: admin.create_new_auth_token, as: :json
response.parsed_body
end
conversation_counts = summaries.map { |s| s['conversations_count'] }
expect(conversation_counts.uniq).to eq([6])
end
end
it 'returns identical message counts across timezones' do
Time.use_zone('UTC') do
get "/api/v2/accounts/#{account.id}/reports/summary",
params: summary_params.merge(timezone_offset: 0),
headers: admin.create_new_auth_token, as: :json
utc_summary = response.parsed_body
get "/api/v2/accounts/#{account.id}/reports/summary",
params: summary_params.merge(timezone_offset: -8),
headers: admin.create_new_auth_token, as: :json
pst_summary = response.parsed_body
expect(utc_summary['incoming_messages_count']).to eq(pst_summary['incoming_messages_count'])
expect(utc_summary['outgoing_messages_count']).to eq(pst_summary['outgoing_messages_count'])
end
end
it 'returns consistent resolution counts across timezones' do
Time.use_zone('UTC') do
get "/api/v2/accounts/#{account.id}/reports/summary",
params: summary_params.merge(timezone_offset: 0),
headers: admin.create_new_auth_token, as: :json
utc_summary = response.parsed_body
get "/api/v2/accounts/#{account.id}/reports/summary",
params: summary_params.merge(timezone_offset: 9),
headers: admin.create_new_auth_token, as: :json
jst_summary = response.parsed_body
expect(utc_summary['resolutions_count']).to eq(jst_summary['resolutions_count'])
end
end
it 'returns consistent previous period data across timezones' do
Time.use_zone('UTC') do
get "/api/v2/accounts/#{account.id}/reports/summary",
params: summary_params.merge(timezone_offset: 0),
headers: admin.create_new_auth_token, as: :json
utc_summary = response.parsed_body
get "/api/v2/accounts/#{account.id}/reports/summary",
params: summary_params.merge(timezone_offset: -8),
headers: admin.create_new_auth_token, as: :json
pst_summary = response.parsed_body
expect(utc_summary['previous']['conversations_count']).to eq(pst_summary['previous']['conversations_count']) if utc_summary['previous']
end
end
it 'summary reports work when frontend sends correct timezone boundaries' do
Time.use_zone('UTC') do
# Create a resolution event right at timezone boundary
boundary_time = Time.utc(2024, 1, 15, 23, 30) # 11:30 PM UTC on Jan 15
gravatar_url = 'https://www.gravatar.com'
stub_request(:get, /#{gravatar_url}.*/).to_return(status: 404)
travel_to boundary_time do
perform_enqueued_jobs do
conversation = create(:conversation, account: account, inbox: inbox, assignee: agent)
conversation.resolved!
end
end
get "/api/v2/accounts/#{account.id}/reports/summary",
params: jst_params.merge(timezone_offset: 9),
headers: admin.create_new_auth_token, as: :json
jst_summary = response.parsed_body
get "/api/v2/accounts/#{account.id}/reports/summary",
params: utc_params.merge(timezone_offset: 0),
headers: admin.create_new_auth_token, as: :json
utc_summary = response.parsed_body
expect(jst_summary['resolutions_count']).to eq(0)
expect(utc_summary['resolutions_count']).to eq(1)
end
end
end
end
context 'when unauthenticated' do
it 'returns unauthorized' do
get "/api/v2/accounts/#{account.id}/reports"
expect(response).to have_http_status(:unauthorized)
end
end
context 'when authenticated but not authorized' do
it 'returns forbidden' do
get "/api/v2/accounts/#{account.id}/reports",
headers: agent.create_new_auth_token,
as: :json
expect(response).to have_http_status(:unauthorized)
end
end
end
end