[PATCH 00/10] Misc ppc/mac machines clean up


From: BALATON Zoltan
Subject: [PATCH 00/10] Misc ppc/mac machines clean up
Date: Sat, 17 Sep 2022 01:07:18 +0200 (CEST)

This series includes some clean ups to mac_newworld and mac_oldworld
to make them a bit simpler and more readable. It also removes the
shared mac.h file, which turned out to be more of a random collection
of unrelated things. Getting rid of mac.h improves the locality of the
device models and reduces unnecessary interdependencies.
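
As a rough illustration of the header split (a simplified sketch only,
not the exact contents of these patches), a device-specific type such
as the MacIO NVRAM ends up declared in its own header next to the
model, so board code only includes what it actually uses:

/* include/hw/nvram/mac_nvram.h -- simplified sketch, field layout is
 * an assumption, not copied from the real header */
#ifndef MAC_NVRAM_H
#define MAC_NVRAM_H

#include "hw/sysbus.h"
#include "qom/object.h"

#define TYPE_MACIO_NVRAM "macio-nvram"
OBJECT_DECLARE_SIMPLE_TYPE(MacIONVRAMState, MACIO_NVRAM)

struct MacIONVRAMState {
    SysBusDevice parent_obj;
    /* device-local state lives here instead of in a shared mac.h */
    MemoryRegion mem;
    uint8_t *data;
    uint32_t size;
};

#endif

/* Board code (e.g. mac_oldworld.c) then does
 *     #include "hw/nvram/mac_nvram.h"
 * rather than pulling in the catch-all hw/ppc/mac.h. */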

BALATON Zoltan (10):
  mac_newworld: Drop some variables
  mac_oldworld: Drop some more variables
  mac_{old|new}world: Set default values for some local variables
  mac_newworld: Simplify creation of Uninorth devices
  mac_{old|new}world: Reduce number of QOM casts
  hw/ppc/mac.h: Move newworld specific stuff out from shared header
  hw/ppc/mac.h: Move macio specific stuff out from shared header
  hw/ppc/mac.h: Move grackle-pcihost declaration out from shared header
  hw/ppc/mac.h: Move PROM and KERNEL defines to board code
  hw/ppc/mac.h: Rename to include/hw/nvram/mac_nvram.h

 MAINTAINERS                   |   1 +
 hw/ide/macio.c                |   1 -
 hw/intc/heathrow_pic.c        |   1 -
 hw/intc/openpic.c             |   1 -
 hw/misc/macio/cuda.c          |   1 -
 hw/misc/macio/gpio.c          |   1 -
 hw/misc/macio/macio.c         |  27 ++++-
 hw/misc/macio/pmu.c           |   1 -
 hw/nvram/mac_nvram.c          |   2 +-
 hw/pci-host/grackle.c         |   2 +-
 hw/pci-host/uninorth.c        |   1 -
 hw/ppc/mac.h                  | 105 ----------------
 hw/ppc/mac_newworld.c         | 220 ++++++++++++++++------------------
 hw/ppc/mac_oldworld.c         | 105 +++++++---------
 include/hw/misc/macio/macio.h |   2 +-
 include/hw/nvram/mac_nvram.h  |  49 ++++++++
 16 files changed, 222 insertions(+), 298 deletions(-)
 delete mode 100644 hw/ppc/mac.h
 create mode 100644 include/hw/nvram/mac_nvram.h

-- 
2.30.4



