
Fixes #36736 - Correct memory fact on HW properties card #10737

Closed
wants to merge 1 commit

Conversation

@chris1984 (Member) commented Sep 14, 2023

What are the changes introduced in this pull request?

Considerations taken when implementing this change?

  • N/A

What are the testing steps for this pull request?

  • Register a client with 4 GB of RAM; the card should incorrectly show 5 GB (this might be reproducible with libvirt; if not, I can give you a test client and a test devel box to try it against)
  • Apply the PR
  • Refresh the page; it should now show the correct amount

@theforeman-bot

Issues: #36736

```diff
@@ -43,7 +43,7 @@ const HwPropertiesCard = ({ isExpandedGlobal, hostDetails }) => {
   const coreSocket = facts?.['cpu::core(s)_per_socket'];
   const reportedFacts = propsToCamelCase(hostDetails?.reported_data || {});
   const totalDisks = reportedFacts?.disksTotal;
-  const memory = facts?.['dmi::memory::maximum_capacity'];
+  const memory = facts?.['dmi::memory::size'];
```
@sbernhard (Member) commented Sep 18, 2023

I think this is wrong and should stay at maximum_capacity:

subscription-manager facts:
[screenshot]

puppet facts:
[screenshot]

free -m:
[screenshot]

3 GB is still more correct than 1 GB.

@chris1984 (Member, Author) commented Sep 18, 2023
What about having a max capacity field and then a system_total field?

@chris1984 (Member, Author) commented Sep 18, 2023

If the user does not have puppet facts, then I guess we only show max capacity and rename that field?

Member commented:

The question is: is this about "hardware properties", as the headline describes, or about the hardware resources the OS can handle?
There are some other facts, e.g. from subscription-manager.
Maybe just use what is already there:
https://github.com/theforeman/foreman/blob/31956e5267cc74b7e06eec73042a24505ba45c6f/app/services/katello/rhsm_fact_parser.rb#L107

or

return host.facts["memory::memtotal"] ? $scope.convertMemToGB(host.facts["memory::memtotal"]) : "";

This is what users expect, I guess.
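For illustration, the conversion referenced above could look roughly like this. This is a minimal sketch, not Foreman's actual helper: `convertMemToGB` here is a hypothetical stand-in, and the kB unit is an assumption based on `memory::memtotal` typically coming from /proc/meminfo.

```javascript
// Hypothetical sketch of a convertMemToGB-style helper (not Foreman's code).
// Assumption: the `memory::memtotal` fact value is in kB.
const convertMemToGB = (memKb) => {
  const gb = Number(memKb) / (1024 * 1024); // kB -> GB
  return `${gb.toFixed(2)} GB`;
};

console.log(convertMemToGB(4046436)); // a ~4 GB client reports roughly this
```

A value like 4046436 kB renders as "3.86 GB", which matches what `free -m` shows rather than the DMI maximum capacity.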

@chris1984 (Member, Author) commented

@sbernhard fixed, let me know if this looks better?
[screenshot]

@ekohl (Member) left a comment

If this comes from facts, why isn't this in Foreman itself? And everything should be stored as reported_data (see https://github.com/theforeman/foreman/blob/develop/app/models/host_facets/reported_data_facet.rb). Direct use of facts is something we should avoid.

```diff
@@ -43,7 +44,7 @@ const HwPropertiesCard = ({ isExpandedGlobal, hostDetails }) => {
   const coreSocket = facts?.['cpu::core(s)_per_socket'];
   const reportedFacts = propsToCamelCase(hostDetails?.reported_data || {});
   const totalDisks = reportedFacts?.disksTotal;
-  const memory = facts?.['dmi::memory::maximum_capacity'];
+  const memory = facts?.['memory::memtotal'];
```
Member commented:

We have this as a reported fact called ram.

Suggested change
```diff
-  const memory = facts?.['memory::memtotal'];
+  const memory = reportedFacts?.ram;
```
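The `reportedFacts` object comes from normalizing `hostDetails.reported_data`. A minimal sketch of what a `propsToCamelCase`-style normalizer does to those snake_case keys (the real helper in foreman-js may differ in details, and the sample values are made up):

```javascript
// Sketch of propsToCamelCase-style key normalization (illustrative only).
const toCamel = (key) => key.replace(/_([a-z])/g, (_, c) => c.toUpperCase());
const propsToCamelCase = (obj) =>
  Object.fromEntries(Object.entries(obj).map(([k, v]) => [toCamel(k), v]));

// Hypothetical reported_data payload:
const reportedFacts = propsToCamelCase({ disks_total: 480, ram: 3907, sockets: 1 });
// reportedFacts.disksTotal, reportedFacts.ram, reportedFacts.sockets
```

This is why the card reads `reportedFacts?.disksTotal` while the underlying reported field is `disks_total`.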

```diff
@@ -43,7 +44,7 @@ const HwPropertiesCard = ({ isExpandedGlobal, hostDetails }) => {
   const coreSocket = facts?.['cpu::core(s)_per_socket'];
```
Member commented:

I think this is reportedFacts?.cores. I can't comment on that line directly, but above there's also reportedFacts?.sockets.

@chris1984 (Member, Author) commented Sep 21, 2023

> If this comes from facts, why isn't this in Foreman itself? And everything should be stored as reported_data (see https://github.com/theforeman/foreman/blob/develop/app/models/host_facets/reported_data_facet.rb). Direct use of facts is something we should avoid.

So this would need a PR in Foreman to have that fact come through as a reported value, which we then use here? I'll see what ram comes back with and, if it's not correct, make a PR to Foreman.

@ekohl (Member) commented Sep 21, 2023

IMHO yes, though I'd even say that this entire card should be dropped from Katello and implemented in Foreman itself. I see no reason why it would be Katello-specific, except that it uses RHSM facts directly instead of normalizing.

```diff
-        <DescriptionListDescription>{memory}</DescriptionListDescription>
+        <DescriptionListDescription>{NumberToHumanSize(memory*1024, {
+          strip_insignificant_zeros: true, precision: 2
+        })}</DescriptionListDescription>
         </DescriptionListGroup>
         <DescriptionListGroup>
           <HostDisks totalDisks={totalDisks} />
```
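What the conversion above does, roughly: the fact value is in kB, so it is multiplied by 1024 to get bytes and then rendered human-readable. An illustrative reimplementation of a Rails-style number_to_human_size (Foreman's actual NumberToHumanSize helper may differ in rounding and unit labels):

```javascript
// Illustrative sketch, not Foreman's NumberToHumanSize implementation.
const numberToHumanSize = (bytes, { precision = 2 } = {}) => {
  const units = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
  let n = bytes;
  let i = 0;
  while (n >= 1024 && i < units.length - 1) {
    n /= 1024;
    i += 1;
  }
  // parseFloat drops trailing zeros, mirroring strip_insignificant_zeros
  return `${parseFloat(n.toFixed(precision))} ${units[i]}`;
};

// The memory fact is in kB, hence the `memory*1024` in the diff:
console.log(numberToHumanSize(4046436 * 1024)); // "3.86 GB"
```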
Member commented:
I just noticed this on the screenshot. This value is the total number of bytes of all disks combined, not the number of disks.

@chris1984 (Member, Author) commented Sep 21, 2023

I have that fixed in a separate PR: #10750

@chris1984 closed this Sep 25, 2023