From: Gary Lin <glin@suse.com>
To: edk2-devel@lists.01.org
Cc: Liming Gao
Date: Thu, 1 Feb 2018 16:35:59 +0800
Message-Id: <20180201083608.16036-12-glin@suse.com>
In-Reply-To: <20180201083608.16036-1-glin@suse.com>
References: <20180201083608.16036-1-glin@suse.com>
Subject: [edk2] [PATCH v2 11/20] BaseTools: Adjust the spaces around commas and colons

Based on "futurize -f lib2to3.fixes.fix_ws_comma"

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu
Cc: Liming Gao
Signed-off-by: Gary Lin <glin@suse.com>
---
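Note for reviewers: lib2to3's fix_ws_comma fixer only normalizes the
whitespace around commas and colons (it drops a stray space before the
separator and makes sure a space follows it), so the hunks below are
mechanical rewrites. A rough sketch of the command and of the kind of
change it produces (the file name is only a placeholder; the two
expressions are taken from hunks in this patch):

  $ futurize -f lib2to3.fixes.fix_ws_comma -w Example.py

  # before
  TokenTypeDict = {"PCD_TYPE_SHIFT":28, "PCD_DATUM_TYPE_SHIFT":24}
  Buffer += pack("=B",0)

  # after
  TokenTypeDict = {"PCD_TYPE_SHIFT": 28, "PCD_DATUM_TYPE_SHIFT": 24}
  Buffer += pack("=B", 0)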
 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py | 2 +-
 BaseTools/Scripts/BinToPcd.py | 8 +-
 BaseTools/Scripts/MemoryProfileSymbolGen.py | 6 +-
 BaseTools/Scripts/PatchCheck.py | 2 +-
 BaseTools/Scripts/RunMakefile.py | 2 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py | 54 +++---
 BaseTools/Source/Python/AutoGen/GenMake.py | 4 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py | 114 ++++++-------
 BaseTools/Source/Python/AutoGen/GenVar.py | 164 +++++++++---------
 BaseTools/Source/Python/BPDG/GenVpd.py | 12 +-
 BaseTools/Source/Python/Common/DataType.py | 4 +-
 BaseTools/Source/Python/Common/DscClassObject.py | 2 +-
 BaseTools/Source/Python/Common/EdkIIWorkspace.py | 2 +-
 BaseTools/Source/Python/Common/Expression.py | 6 +-
 BaseTools/Source/Python/Common/FdfParserLite.py | 12 +-
 BaseTools/Source/Python/Common/Misc.py | 46 ++---
 BaseTools/Source/Python/Common/RangeExpression.py | 4 +-
 BaseTools/Source/Python/Common/String.py | 2 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py | 10 +-
 BaseTools/Source/Python/Ecc/CParser.py | 28 +--
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 14 +-
 BaseTools/Source/Python/Eot/CParser.py | 28 +--
 BaseTools/Source/Python/Eot/c.py | 20 +--
 BaseTools/Source/Python/GenFds/AprioriSection.py | 2 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py | 2 +-
 BaseTools/Source/Python/GenFds/EfiSection.py | 6 +-
 BaseTools/Source/Python/GenFds/Fd.py | 6 +-
 BaseTools/Source/Python/GenFds/FdfParser.py | 26 +--
 BaseTools/Source/Python/GenFds/FfsInfStatement.py | 12 +-
 BaseTools/Source/Python/GenFds/Fv.py | 4 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py | 4 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py | 4 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py | 2 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py | 2 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py | 2 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py | 6 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py | 12 +-
 BaseTools/Source/Python/Trim/Trim.py | 14 +-
 BaseTools/Source/Python/UPT/Core/DependencyRules.py | 8 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py | 4 +-
 BaseTools/Source/Python/UPT/Library/String.py | 2 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py | 2 +-
 BaseTools/Source/Python/UPT/UPT.py | 2 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py | 2 +-
 BaseTools/Source/Python/UPT/Xml/XmlParser.py | 24 +--
 BaseTools/Source/Python/Workspace/DecBuildData.py | 14 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py | 178 ++++++++++----------
 BaseTools/Source/Python/Workspace/MetaFileParser.py | 36 ++--
 BaseTools/Source/Python/Workspace/MetaFileTable.py | 6 +-
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py | 2 +-
 BaseTools/Source/Python/build/BuildReport.py | 8 +-
 BaseTools/Source/Python/build/build.py | 8 +-
 BaseTools/Tests/TestTools.py | 2 +-
 BaseTools/gcc/mingw-gcc-build.py | 2 +-
 54 files changed, 475 insertions(+), 475 deletions(-)

diff --git a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
index dd66c7111ac0..b226499e8450 100755
--- a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
+++ b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
@@ -48,7 +48,7 @@ def ConvertCygPathToDos(CygPath):
 DosPath = CygPath

 # pipes.quote will add the extra \\ for us.
- return DosPath.replace('/','\\')
+ return DosPath.replace('/', '\\')


 # we receive our options as a list, but we will be passing them to the shell as a line
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 7d8cd0a5cc25..0997ee408c05 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -42,13 +42,13 @@ if __name__ == '__main__':
 return Value

 def ValidatePcdName (Argument):
- if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
+ if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
 Message = '%s is not in the form .' % (Argument)
 raise argparse.ArgumentTypeError(Message)
 return Argument

 def ValidateGuidName (Argument):
- if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
+ if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
 Message = '%s is not a valid GUID C name' % (Argument)
 raise argparse.ArgumentTypeError(Message)
 return Argument
@@ -71,7 +71,7 @@ if __name__ == '__main__':
 help = "Output filename for PCD value or PCD statement")
 parser.add_argument("-p", "--pcd", dest = 'PcdName', type = ValidatePcdName,
 help = "Name of the PCD in the form .")
- parser.add_argument("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD','HII'],
+ parser.add_argument("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD', 'HII'],
 help = "PCD statement type (HII or VPD). Default is standard.")
 parser.add_argument("-m", "--max-size", dest = 'MaxSize', type = ValidateUnsignedInteger,
 help = "Maximum size of the PCD. Ignored with --type HII.")
@@ -85,7 +85,7 @@ if __name__ == '__main__':
 help = "Increase output messages")
 parser.add_argument("-q", "--quiet", dest = 'Quiet', action = "store_true",
 help = "Reduce output messages")
- parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = list(range(0,10)), default = 0,
+ parser.add_argument("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = list(range(0, 10)), default = 0,
 help = "Set debug level")

 #
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 3bc6a8897bcc..c9158800668d 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -190,7 +190,7 @@ def processLine(newline):

 driverPrefixLen = len("Driver - ")
 # get driver name
- if cmp(newline[0:driverPrefixLen],"Driver - ") == 0 :
+ if cmp(newline[0:driverPrefixLen], "Driver - ") == 0 :
 driverlineList = newline.split(" ")
 driverName = driverlineList[2]
 #print "Checking : ", driverName
@@ -213,7 +213,7 @@ def processLine(newline):
 else :
 symbolsFile.symbolsTable[driverName].parse_debug_file (driverName, pdbName)

- elif cmp(newline,"") == 0 :
+ elif cmp(newline, "") == 0 :
 driverName = ""

 # check entry line
@@ -226,7 +226,7 @@ def processLine(newline):
 rvaName = ""
 symbolName = ""

- if cmp(rvaName,"") == 0 :
+ if cmp(rvaName, "") == 0 :
 return newline
 else :
 return newline + symbolName
diff --git a/BaseTools/Scripts/PatchCheck.py b/BaseTools/Scripts/PatchCheck.py
index 51d4adf08b60..211db566cb25 100755
--- a/BaseTools/Scripts/PatchCheck.py
+++ b/BaseTools/Scripts/PatchCheck.py
@@ -286,7 +286,7 @@ class GitDiffCheck:
 if self.state == START:
 if line.startswith('diff --git'):
 self.state = PRE_PATCH
- self.filename = line[13:].split(' ',1)[0]
+ self.filename = line[13:].split(' ', 1)[0]
 self.is_newfile = False
 self.force_crlf = not self.filename.endswith('.sh')
 elif len(line.rstrip()) != 0:
diff --git a/BaseTools/Scripts/RunMakefile.py b/BaseTools/Scripts/RunMakefile.py
index 48bc198c7671..6d0c4553c9eb 100644
--- a/BaseTools/Scripts/RunMakefile.py
+++ b/BaseTools/Scripts/RunMakefile.py
@@ -149,7 +149,7 @@ if __name__ == '__main__':
 for Item in gArgs.Define:
 if '=' not in Item[0]:
 continue
- Item = Item[0].split('=',1)
+ Item = Item[0].split('=', 1)
 CommandLine.append('%s="%s"' % (Item[0], Item[1]))
CommandLine.append('EXTRA_FLAGS=3D"%s"' % (gArgs.Remaining)) CommandLine.append(gArgs.BuildType) diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/= Python/AutoGen/AutoGen.py index 18da411f83a0..0017f66e5ec8 100644 --- a/BaseTools/Source/Python/AutoGen/AutoGen.py +++ b/BaseTools/Source/Python/AutoGen/AutoGen.py @@ -46,7 +46,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as= mws import InfSectionParser import datetime import hashlib -from GenVar import VariableMgr,var_info +from GenVar import VariableMgr, var_info =20 ## Regular expression for splitting Dependency Expression string into toke= ns gDepexTokenPattern =3D re.compile("(\(|\)|\w+| \S+\.inf)") @@ -1286,7 +1286,7 @@ class PlatformAutoGen(AutoGen): ShareFixedAtBuildPcdsSameValue =3D {}=20 for Module in LibAuto._ReferenceModules: =20 for Pcd in Module.FixedAtBuildPcds + LibAuto.FixedAtBuildP= cds: - key =3D ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCNa= me)) =20 + key =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCN= ame)) if key not in FixedAtBuildPcds: ShareFixedAtBuildPcdsSameValue[key] =3D True FixedAtBuildPcds[key] =3D Pcd.DefaultValue @@ -1294,11 +1294,11 @@ class PlatformAutoGen(AutoGen): if FixedAtBuildPcds[key] !=3D Pcd.DefaultValue: ShareFixedAtBuildPcdsSameValue[key] =3D False = =20 for Pcd in LibAuto.FixedAtBuildPcds: - key =3D ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName)) - if (Pcd.TokenCName,Pcd.TokenSpaceGuidCName) not in self.No= nDynamicPcdDict: + key =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName)) + if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.N= onDynamicPcdDict: continue else: - DscPcd =3D self.NonDynamicPcdDict[(Pcd.TokenCName,Pcd.= TokenSpaceGuidCName)] + DscPcd =3D self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd= .TokenSpaceGuidCName)] if DscPcd.Type !=3D "FixedAtBuild": continue if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtB= uildPcdsSameValue[key]: =20 @@ -1318,12 +1318,12 @@ class PlatformAutoGen(AutoGen): break =20 =20 - VariableInfo =3D VariableMgr(self.DscBuildDataObj._GetDefaultStore= s(),self.DscBuildDataObj._GetSkuIds()) + VariableInfo =3D VariableMgr(self.DscBuildDataObj._GetDefaultStore= s(), self.DscBuildDataObj._GetSkuIds()) VariableInfo.SetVpdRegionMaxSize(VpdRegionSize) VariableInfo.SetVpdRegionOffset(VpdRegionBase) Index =3D 0 for Pcd in DynamicPcdSet: - pcdname =3D ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName)) + pcdname =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName)) for SkuName in Pcd.SkuInfoList: Sku =3D Pcd.SkuInfoList[SkuName] SkuId =3D Sku.SkuId @@ -1333,11 +1333,11 @@ class PlatformAutoGen(AutoGen): VariableGuidStructure =3D Sku.VariableGuidValue VariableGuid =3D GuidStructureStringToGuidString(Varia= bleGuidStructure) for StorageName in Sku.DefaultStoreDict: - VariableInfo.append_variable(var_info(Index,pcdnam= e,StorageName,SkuName, StringToArray(Sku.VariableName),VariableGuid, Sku.Va= riableOffset, Sku.VariableAttribute , Sku.HiiDefaultValue,Sku.DefaultStoreD= ict[StorageName],Pcd.DatumType)) + VariableInfo.append_variable(var_info(Index, pcdna= me, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sk= u.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultSt= oreDict[StorageName], Pcd.DatumType)) Index +=3D 1 return VariableInfo =20 - def UpdateNVStoreMaxSize(self,OrgVpdFile): + def UpdateNVStoreMaxSize(self, OrgVpdFile): if self.VariableInfo: VpdMapFilePath =3D os.path.join(self.BuildDir, "FV", "%s.map" = % self.Platform.VpdToolGuid) PcdNvStoreDfBuffer =3D [item 
for item in self._DynamicPcdList = if item.TokenCName =3D=3D "PcdNvStoreDefaultValueBuffer" and item.TokenSpac= eGuidCName =3D=3D "gEfiMdeModulePkgTokenSpaceGuid"] @@ -1350,7 +1350,7 @@ class PlatformAutoGen(AutoGen): else: EdkLogger.error("build", FILE_READ_FAILURE, "Can not f= ind VPD map file %s to fix up VPD offset." % VpdMapFilePath) =20 - NvStoreOffset =3D int(NvStoreOffset,16) if NvStoreOffset.u= pper().startswith("0X") else int(NvStoreOffset) + NvStoreOffset =3D int(NvStoreOffset, 16) if NvStoreOffset.= upper().startswith("0X") else int(NvStoreOffset) default_skuobj =3D PcdNvStoreDfBuffer[0].SkuInfoList.get("= DEFAULT") maxsize =3D self.VariableInfo.VpdRegionSize - NvStoreOffs= et if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.= split(",")) var_data =3D self.VariableInfo.PatchNVStoreDefaultMaxSize(= maxsize) @@ -1598,7 +1598,7 @@ class PlatformAutoGen(AutoGen): VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = =3D Pcd =20 #Collect DynamicHii PCD values and assign it to DynamicExVpd P= CD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer - PcdNvStoreDfBuffer =3D VpdPcdDict.get(("PcdNvStoreDefaultValue= Buffer","gEfiMdeModulePkgTokenSpaceGuid")) + PcdNvStoreDfBuffer =3D VpdPcdDict.get(("PcdNvStoreDefaultValue= Buffer", "gEfiMdeModulePkgTokenSpaceGuid")) if PcdNvStoreDfBuffer: self.VariableInfo =3D self.CollectVariables(self._DynamicP= cdList) vardump =3D self.VariableInfo.dump() @@ -1625,10 +1625,10 @@ class PlatformAutoGen(AutoGen): PcdValue =3D DefaultSku.DefaultValue if PcdValue not in SkuValueMap: SkuValueMap[PcdValue] =3D [] - VpdFile.Add(Pcd, 'DEFAULT',DefaultSku.VpdOffse= t) + VpdFile.Add(Pcd, 'DEFAULT', DefaultSku.VpdOffs= et) SkuValueMap[PcdValue].append(DefaultSku) =20 - for (SkuName,Sku) in Pcd.SkuInfoList.items(): + for (SkuName, Sku) in Pcd.SkuInfoList.items(): Sku.VpdOffset =3D Sku.VpdOffset.strip() PcdValue =3D Sku.DefaultValue if PcdValue =3D=3D "": @@ -1654,7 +1654,7 @@ class PlatformAutoGen(AutoGen): EdkLogger.error("build", FORMAT_INVALI= D, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.TokenS= paceGuidCName, Pcd.TokenCName, Alignment)) if PcdValue not in SkuValueMap: SkuValueMap[PcdValue] =3D [] - VpdFile.Add(Pcd, SkuName,Sku.VpdOffset) + VpdFile.Add(Pcd, SkuName, Sku.VpdOffset) SkuValueMap[PcdValue].append(Sku) # if the offset of a VPD is *, then it need to be = fixed up by third party tool. if not NeedProcessVpdMapFile and Sku.VpdOffset =3D= =3D "*": @@ -1686,9 +1686,9 @@ class PlatformAutoGen(AutoGen): SkuObjList =3D DscPcdEntry.SkuInfoList.items() DefaultSku =3D DscPcdEntry.SkuInfoList.get('DE= FAULT') if DefaultSku: - defaultindex =3D SkuObjList.index(('DEFAUL= T',DefaultSku)) - SkuObjList[0],SkuObjList[defaultindex] =3D= SkuObjList[defaultindex],SkuObjList[0] - for (SkuName,Sku) in SkuObjList: + defaultindex =3D SkuObjList.index(('DEFAUL= T', DefaultSku)) + SkuObjList[0], SkuObjList[defaultindex] = =3D SkuObjList[defaultindex], SkuObjList[0] + for (SkuName, Sku) in SkuObjList: Sku.VpdOffset =3D Sku.VpdOffset.strip()=20 =20 # Need to iterate DEC pcd information to g= et the value & datumtype @@ -1738,7 +1738,7 @@ class PlatformAutoGen(AutoGen): EdkLogger.error("build", FORMA= T_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' 
% (Ds= cPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment)) if PcdValue not in SkuValueMap: SkuValueMap[PcdValue] =3D [] - VpdFile.Add(DscPcdEntry, SkuName,Sku.V= pdOffset) + VpdFile.Add(DscPcdEntry, SkuName, Sku.= VpdOffset) SkuValueMap[PcdValue].append(Sku) if not NeedProcessVpdMapFile and Sku.VpdOf= fset =3D=3D "*": NeedProcessVpdMapFile =3D True=20 @@ -1804,17 +1804,17 @@ class PlatformAutoGen(AutoGen): self._DynamicPcdList.extend(list(UnicodePcdArray)) self._DynamicPcdList.extend(list(HiiPcdArray)) self._DynamicPcdList.extend(list(OtherPcdArray)) - allskuset =3D [(SkuName,Sku.SkuId) for pcd in self._DynamicPcdList= for (SkuName,Sku) in pcd.SkuInfoList.items()] + allskuset =3D [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdLis= t for (SkuName, Sku) in pcd.SkuInfoList.items()] for pcd in self._DynamicPcdList: if len(pcd.SkuInfoList) =3D=3D 1: - for (SkuName,SkuId) in allskuset: - if type(SkuId) in (str,unicode) and eval(SkuId) =3D=3D= 0 or SkuId =3D=3D 0: + for (SkuName, SkuId) in allskuset: + if type(SkuId) in (str, unicode) and eval(SkuId) =3D= =3D 0 or SkuId =3D=3D 0: continue pcd.SkuInfoList[SkuName] =3D copy.deepcopy(pcd.SkuInfo= List['DEFAULT']) pcd.SkuInfoList[SkuName].SkuId =3D SkuId self.AllPcdList =3D self._NonDynamicPcdList + self._DynamicPcdList =20 - def FixVpdOffset(self,VpdFile ): + def FixVpdOffset(self, VpdFile): FvPath =3D os.path.join(self.BuildDir, "FV") if not os.path.exists(FvPath): try: @@ -2076,7 +2076,7 @@ class PlatformAutoGen(AutoGen): if self._NonDynamicPcdDict: return self._NonDynamicPcdDict for Pcd in self.NonDynamicPcdList: - self._NonDynamicPcdDict[(Pcd.TokenCName,Pcd.TokenSpaceGuidCNam= e)] =3D Pcd + self._NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCNa= me)] =3D Pcd return self._NonDynamicPcdDict =20 ## Get list of non-dynamic PCDs @@ -3887,7 +3887,7 @@ class ModuleAutoGen(AutoGen): try: fInputfile =3D open(UniVfrOffsetFileName, "wb+", 0) except: - EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed = for %s" % UniVfrOffsetFileName,None) + EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed = for %s" % UniVfrOffsetFileName, None) =20 # Use a instance of StringIO to cache data fStringIO =3D StringIO('') =20 @@ -3923,7 +3923,7 @@ class ModuleAutoGen(AutoGen): fInputfile.write (fStringIO.getvalue()) except: EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to fi= le %s failed, please check whether the " - "file been locked or using by other applicatio= ns." %UniVfrOffsetFileName,None) + "file been locked or using by other applicatio= ns." 
%UniVfrOffsetFileName, None) =20 fStringIO.close () fInputfile.close () @@ -4370,7 +4370,7 @@ class ModuleAutoGen(AutoGen): def CopyBinaryFiles(self): for File in self.Module.Binaries: SrcPath =3D File.Path - DstPath =3D os.path.join(self.OutputDir , os.path.basename(Src= Path)) + DstPath =3D os.path.join(self.OutputDir, os.path.basename(SrcP= ath)) CopyLongFilePath(SrcPath, DstPath) ## Create autogen code for the module and its dependent libraries # @@ -4521,7 +4521,7 @@ class ModuleAutoGen(AutoGen): if SrcTimeStamp > DstTimeStamp: return False =20 - with open(self.GetTimeStampPath(),'r') as f: + with open(self.GetTimeStampPath(), 'r') as f: for source in f: source =3D source.rstrip('\n') if not os.path.exists(source): diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/= Python/AutoGen/GenMake.py index 8891b1b97d23..eb56d0e7c5a3 100644 --- a/BaseTools/Source/Python/AutoGen/GenMake.py +++ b/BaseTools/Source/Python/AutoGen/GenMake.py @@ -746,7 +746,7 @@ cleanlib: if CmdName =3D=3D 'Trim': SecDepsFileList.append(os.path.join('$(DEBUG_D= IR)', os.path.basename(OutputFile).replace('offset', 'efi'))) if OutputFile.endswith('.ui') or OutputFile.endswi= th('.ver'): - SecDepsFileList.append(os.path.join('$(MODULE_= DIR)','$(MODULE_FILE)')) + SecDepsFileList.append(os.path.join('$(MODULE_= DIR)', '$(MODULE_FILE)')) self.FfsOutputFileList.append((OutputFile, ' '.joi= n(SecDepsFileList), SecCmdStr)) if len(SecDepsFileList) > 0: self.ParseSecCmd(SecDepsFileList, CmdTuple) @@ -864,7 +864,7 @@ cleanlib: for Target in BuildTargets: for i, SingleCommand in enumerate(BuildTargets= [Target].Commands): if FlagDict[Flag]['Macro'] in SingleComman= d: - BuildTargets[Target].Commands[i] =3D S= ingleCommand.replace('$(INC)','').replace(FlagDict[Flag]['Macro'], RespMacr= o) + BuildTargets[Target].Commands[i] =3D S= ingleCommand.replace('$(INC)', '').replace(FlagDict[Flag]['Macro'], RespMac= ro) return RespDict =20 def ProcessBuildTargetList(self): diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source= /Python/AutoGen/GenPcdDb.py index a989cb34dff3..85e6f44502a2 100644 --- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py +++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py @@ -270,7 +270,7 @@ def toHex(s): hv =3D '0'+hv lst.append(hv) if lst: - return reduce(lambda x,y:x+y, lst) + return reduce(lambda x, y:x+y, lst) else: return 'empty' ## DbItemList @@ -650,22 +650,22 @@ def StringArrayToList(StringArray): # def GetTokenTypeValue(TokenType): TokenTypeDict =3D { - "PCD_TYPE_SHIFT":28, - "PCD_TYPE_DATA":(0x0 << 28), - "PCD_TYPE_HII":(0x8 << 28), - "PCD_TYPE_VPD":(0x4 << 28), + "PCD_TYPE_SHIFT": 28, + "PCD_TYPE_DATA": (0x0 << 28), + "PCD_TYPE_HII": (0x8 << 28), + "PCD_TYPE_VPD": (0x4 << 28), # "PCD_TYPE_SKU_ENABLED":(0x2 << 28), - "PCD_TYPE_STRING":(0x1 << 28), + "PCD_TYPE_STRING": (0x1 << 28), =20 - "PCD_DATUM_TYPE_SHIFT":24, - "PCD_DATUM_TYPE_POINTER":(0x0 << 24), - "PCD_DATUM_TYPE_UINT8":(0x1 << 24), - "PCD_DATUM_TYPE_UINT16":(0x2 << 24), - "PCD_DATUM_TYPE_UINT32":(0x4 << 24), - "PCD_DATUM_TYPE_UINT64":(0x8 << 24), + "PCD_DATUM_TYPE_SHIFT": 24, + "PCD_DATUM_TYPE_POINTER": (0x0 << 24), + "PCD_DATUM_TYPE_UINT8": (0x1 << 24), + "PCD_DATUM_TYPE_UINT16": (0x2 << 24), + "PCD_DATUM_TYPE_UINT32": (0x4 << 24), + "PCD_DATUM_TYPE_UINT64": (0x8 << 24), =20 - "PCD_DATUM_TYPE_SHIFT2":20, - "PCD_DATUM_TYPE_UINT8_BOOLEAN":(0x1 << 20 | 0x1 << 24), + "PCD_DATUM_TYPE_SHIFT2": 20, + "PCD_DATUM_TYPE_UINT8_BOOLEAN": (0x1 << 20 | 0x1 << 24), } return eval(TokenType, TokenTypeDict) =20 @@ -719,7 
+719,7 @@ def BuildExDataBase(Dict): DbPcdCNameTable =3D DbStringItemList(0, RawDataList =3D PcdCNameTableV= alue, LenList =3D PcdCNameLen) =20 PcdNameOffsetTable =3D Dict['PCD_NAME_OFFSET'] - DbPcdNameOffsetTable =3D DbItemList(4,RawDataList =3D PcdNameOffsetTab= le) + DbPcdNameOffsetTable =3D DbItemList(4, RawDataList =3D PcdNameOffsetTa= ble) =20 SizeTableValue =3D zip(Dict['SIZE_TABLE_MAXIMUM_LENGTH'], Dict['SIZE_T= ABLE_CURRENT_LENGTH']) DbSizeTableValue =3D DbSizeTableItemList(2, RawDataList =3D SizeTableV= alue) @@ -754,16 +754,16 @@ def BuildExDataBase(Dict): PcdTokenNumberMap =3D Dict['PCD_ORDER_TOKEN_NUMBER_MAP'] =20 DbNameTotle =3D ["SkuidValue", "InitValueUint64", "VardefValueUint64"= , "InitValueUint32", "VardefValueUint32", "VpdHeadValue", "ExMapTable", - "LocalTokenNumberTable", "GuidTable", "StringHeadValue", "= PcdNameOffsetTable","VariableTable", "StringTableLen", "PcdTokenTable", "Pc= dCNameTable", + "LocalTokenNumberTable", "GuidTable", "StringHeadValue", "= PcdNameOffsetTable", "VariableTable", "StringTableLen", "PcdTokenTable", "P= cdCNameTable", "SizeTableValue", "InitValueUint16", "VardefValueUint16", "= InitValueUint8", "VardefValueUint8", "InitValueBoolean", "VardefValueBoolean", "UnInitValueUint64", "UnInitValueUint= 32", "UnInitValueUint16", "UnInitValueUint8", "UnInitValueBoolean"] =20 DbTotal =3D [SkuidValue, InitValueUint64, VardefValueUint64, InitValu= eUint32, VardefValueUint32, VpdHeadValue, ExMapTable, - LocalTokenNumberTable, GuidTable, StringHeadValue, PcdName= OffsetTable,VariableTable, StringTableLen, PcdTokenTable,PcdCNameTable, + LocalTokenNumberTable, GuidTable, StringHeadValue, PcdName= OffsetTable, VariableTable, StringTableLen, PcdTokenTable, PcdCNameTable, SizeTableValue, InitValueUint16, VardefValueUint16, InitVal= ueUint8, VardefValueUint8, InitValueBoolean, VardefValueBoolean, UnInitValueUint64, UnInitValueUint32, U= nInitValueUint16, UnInitValueUint8, UnInitValueBoolean] DbItemTotal =3D [DbSkuidValue, DbInitValueUint64, DbVardefValueUint64= , DbInitValueUint32, DbVardefValueUint32, DbVpdHeadValue, DbExMapTable, - DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue, D= bPcdNameOffsetTable,DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbP= cdCNameTable, + DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue, D= bPcdNameOffsetTable, DbVariableTable, DbStringTableLen, DbPcdTokenTable, Db= PcdCNameTable, DbSizeTableValue, DbInitValueUint16, DbVardefValueUint16, D= bInitValueUint8, DbVardefValueUint8, DbInitValueBoolean, DbVardefValueBoolean, DbUnInitValueUint64, DbUnInitValueUin= t32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean] =20 @@ -822,7 +822,7 @@ def BuildExDataBase(Dict): DbOffset +=3D (8 - DbOffset % 8) else: assert(False) - if isinstance(VariableRefTable[0],list): + if isinstance(VariableRefTable[0], list): DbOffset +=3D skuindex * 4 =20 skuindex +=3D 1 if DbIndex >=3D InitTableNum: @@ -984,46 +984,46 @@ def CreatePcdDataBase(PcdDBData): basedata =3D {} if not PcdDBData: return "" - for skuname,skuid in PcdDBData: - if len(PcdDBData[(skuname,skuid)][1]) !=3D len(PcdDBData[("DEFAULT= ","0")][1]): + for skuname, skuid in PcdDBData: + if len(PcdDBData[(skuname, skuid)][1]) !=3D len(PcdDBData[("DEFAUL= T", "0")][1]): EdkLogger.ERROR("The size of each sku in one pcd are not same") - for skuname,skuid in PcdDBData: + for skuname, skuid in PcdDBData: if skuname =3D=3D "DEFAULT": continue - delta[(skuname,skuid)] =3D [(index,data,hex(data)) for index,data = in enumerate(PcdDBData[(skuname,skuid)][1]) if 
PcdDBData[(skuname,skuid)][1= ][index] !=3D PcdDBData[("DEFAULT","0")][1][index]] - basedata[(skuname,skuid)] =3D [(index,PcdDBData[("DEFAULT","0")][1= ][index],hex(PcdDBData[("DEFAULT","0")][1][index])) for index,data in enume= rate(PcdDBData[(skuname,skuid)][1]) if PcdDBData[(skuname,skuid)][1][index]= !=3D PcdDBData[("DEFAULT","0")][1][index]] - databasebuff =3D PcdDBData[("DEFAULT","0")][0] + delta[(skuname, skuid)] =3D [(index, data, hex(data)) for index, d= ata in enumerate(PcdDBData[(skuname, skuid)][1]) if PcdDBData[(skuname, sku= id)][1][index] !=3D PcdDBData[("DEFAULT", "0")][1][index]] + basedata[(skuname, skuid)] =3D [(index, PcdDBData[("DEFAULT", "0")= ][1][index], hex(PcdDBData[("DEFAULT", "0")][1][index])) for index, data in= enumerate(PcdDBData[(skuname, skuid)][1]) if PcdDBData[(skuname, skuid)][1= ][index] !=3D PcdDBData[("DEFAULT", "0")][1][index]] + databasebuff =3D PcdDBData[("DEFAULT", "0")][0] =20 - for skuname,skuid in delta: + for skuname, skuid in delta: # 8 byte align if len(databasebuff) % 8 > 0: for i in range(8 - (len(databasebuff) % 8)): - databasebuff +=3D pack("=3DB",0) + databasebuff +=3D pack("=3DB", 0) databasebuff +=3D pack('=3DQ', int(skuid)) databasebuff +=3D pack('=3DQ', 0) - databasebuff +=3D pack('=3DL', 8+8+4+4*len(delta[(skuname,skuid)])) - for item in delta[(skuname,skuid)]: - databasebuff +=3D pack("=3DL",item[0]) - databasebuff =3D databasebuff[:-1] + pack("=3DB",item[1]) + databasebuff +=3D pack('=3DL', 8+8+4+4*len(delta[(skuname, skuid)]= )) + for item in delta[(skuname, skuid)]: + databasebuff +=3D pack("=3DL", item[0]) + databasebuff =3D databasebuff[:-1] + pack("=3DB", item[1]) totallen =3D len(databasebuff) - totallenbuff =3D pack("=3DL",totallen) + totallenbuff =3D pack("=3DL", totallen) newbuffer =3D databasebuff[:32] for i in range(4): newbuffer +=3D totallenbuff[i] - for i in range(36,totallen): + for i in range(36, totallen): newbuffer +=3D databasebuff[i] =20 return newbuffer def CreateVarCheckBin(VarCheckTab): - return VarCheckTab[('DEFAULT',"0")] + return VarCheckTab[('DEFAULT', "0")] def CreateAutoGen(PcdDriverAutoGenData): autogenC =3D TemplateString() - for skuname,skuid in PcdDriverAutoGenData: + for skuname, skuid in PcdDriverAutoGenData: autogenC.Append("//SKUID: %s" % skuname) - autogenC.Append(PcdDriverAutoGenData[(skuname,skuid)][1].String) - return (PcdDriverAutoGenData[(skuname,skuid)][0],autogenC) -def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform,Phase): - def prune_sku(pcd,skuname): + autogenC.Append(PcdDriverAutoGenData[(skuname, skuid)][1].String) + return (PcdDriverAutoGenData[(skuname, skuid)][0], autogenC) +def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform, Phase): + def prune_sku(pcd, skuname): new_pcd =3D copy.deepcopy(pcd) new_pcd.SkuInfoList =3D {skuname:pcd.SkuInfoList[skuname]} new_pcd.isinit =3D 'INIT' @@ -1041,28 +1041,28 @@ def NewCreatePcdDatabasePhaseSpecificAutoGen(Platfo= rm,Phase): new_pcd.isinit =3D "UNINIT" return new_pcd DynamicPcds =3D Platform.DynamicPcdList - DynamicPcdSet_Sku =3D {(SkuName,skuobj.SkuId):[] for pcd in DynamicPcd= s for (SkuName,skuobj) in pcd.SkuInfoList.items() } - for skuname,skuid in DynamicPcdSet_Sku: - DynamicPcdSet_Sku[(skuname,skuid)] =3D [prune_sku(pcd,skuname) for= pcd in DynamicPcds] + DynamicPcdSet_Sku =3D {(SkuName, skuobj.SkuId):[] for pcd in DynamicPc= ds for (SkuName, skuobj) in pcd.SkuInfoList.items() } + for skuname, skuid in DynamicPcdSet_Sku: + DynamicPcdSet_Sku[(skuname, skuid)] =3D [prune_sku(pcd, skuname) f= or pcd in DynamicPcds] PcdDBData =3D 
{} PcdDriverAutoGenData =3D {} VarCheckTableData =3D {} if DynamicPcdSet_Sku: - for skuname,skuid in DynamicPcdSet_Sku: - AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer,VarCheckTa= b =3D CreatePcdDatabasePhaseSpecificAutoGen (Platform,DynamicPcdSet_Sku[(sk= uname,skuid)], Phase) + for skuname, skuid in DynamicPcdSet_Sku: + AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckT= ab =3D CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdSet_Sku[(= skuname, skuid)], Phase) final_data =3D () for item in PcdDbBuffer: - final_data +=3D unpack("B",item) - PcdDBData[(skuname,skuid)] =3D (PcdDbBuffer, final_data) - PcdDriverAutoGenData[(skuname,skuid)] =3D (AdditionalAutoGenH,= AdditionalAutoGenC) - VarCheckTableData[(skuname,skuid)] =3D VarCheckTab + final_data +=3D unpack("B", item) + PcdDBData[(skuname, skuid)] =3D (PcdDbBuffer, final_data) + PcdDriverAutoGenData[(skuname, skuid)] =3D (AdditionalAutoGenH= , AdditionalAutoGenC) + VarCheckTableData[(skuname, skuid)] =3D VarCheckTab if Platform.Platform.VarCheckFlag: dest =3D os.path.join(Platform.BuildDir, 'FV') VarCheckTable =3D CreateVarCheckBin(VarCheckTableData) VarCheckTable.dump(dest, Phase) AdditionalAutoGenH, AdditionalAutoGenC =3D CreateAutoGen(PcdDrive= rAutoGenData) else: - AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer,VarCheckTab = =3D CreatePcdDatabasePhaseSpecificAutoGen (Platform,{}, Phase) + AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = =3D CreatePcdDatabasePhaseSpecificAutoGen (Platform, {}, Phase) =20 PcdDbBuffer =3D CreatePcdDataBase(PcdDBData) return AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer @@ -1103,20 +1103,20 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform= , DynamicPcdList, Phase): =20 Dict['PCD_INFO_FLAG'] =3D Platform.Platform.PcdInfoFlag =20 - for DatumType in ['UINT64','UINT32','UINT16','UINT8','BOOLEAN', "VOID*= "]: + for DatumType in ['UINT64', 'UINT32', 'UINT16', 'UINT8', 'BOOLEAN', "V= OID*"]: Dict['VARDEF_CNAME_' + DatumType] =3D [] Dict['VARDEF_GUID_' + DatumType] =3D [] Dict['VARDEF_SKUID_' + DatumType] =3D [] Dict['VARDEF_VALUE_' + DatumType] =3D [] Dict['VARDEF_DB_VALUE_' + DatumType] =3D [] - for Init in ['INIT','UNINIT']: + for Init in ['INIT', 'UNINIT']: Dict[Init+'_CNAME_DECL_' + DatumType] =3D [] Dict[Init+'_GUID_DECL_' + DatumType] =3D [] Dict[Init+'_NUMSKUS_DECL_' + DatumType] =3D [] Dict[Init+'_VALUE_' + DatumType] =3D [] Dict[Init+'_DB_VALUE_'+DatumType] =3D [] =20 - for Type in ['STRING_HEAD','VPD_HEAD','VARIABLE_HEAD']: + for Type in ['STRING_HEAD', 'VPD_HEAD', 'VARIABLE_HEAD']: Dict[Type + '_CNAME_DECL'] =3D [] Dict[Type + '_GUID_DECL'] =3D [] Dict[Type + '_NUMSKUS_DECL'] =3D [] @@ -1284,7 +1284,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, = DynamicPcdList, Phase): Dict['STRING_TABLE_INDEX'].append('') else: Dict['STRING_TABLE_INDEX'].append('_%d' % StringTa= bleIndex) - VarNameSize =3D len(VariableNameStructure.replace(',',= ' ').split()) + VarNameSize =3D len(VariableNameStructure.replace(',',= ' ').split()) Dict['STRING_TABLE_LENGTH'].append(VarNameSize ) Dict['STRING_TABLE_VALUE'].append(VariableNameStructur= e) StringHeadOffsetList.append(str(StringTableSize) + 'U') @@ -1292,7 +1292,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, = DynamicPcdList, Phase): VarStringDbOffsetList.append(StringTableSize) Dict['STRING_DB_VALUE'].append(VarStringDbOffsetList) StringTableIndex +=3D 1 - StringTableSize +=3D len(VariableNameStructure.replace= (',',' ').split()) + StringTableSize +=3D 
len(VariableNameStructure.replace= (',', ' ').split()) VariableHeadStringIndex =3D 0 for Index in range(Dict['STRING_TABLE_VALUE'].index(Variab= leNameStructure)): VariableHeadStringIndex +=3D Dict['STRING_TABLE_LENGTH= '][Index] @@ -1331,7 +1331,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, = DynamicPcdList, Phase): elif Pcd.DatumType in ("UINT32", "UINT16", "UINT8"): Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.Hii= DefaultValue + "U") elif Pcd.DatumType =3D=3D "BOOLEAN": - if eval(Sku.HiiDefaultValue) in [1,0]: + if eval(Sku.HiiDefaultValue) in [1, 0]: Dict['VARDEF_VALUE_'+Pcd.DatumType].append(str= (eval(Sku.HiiDefaultValue)) + "U") else: Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.Hii= DefaultValue) @@ -1381,7 +1381,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, = DynamicPcdList, Phase): Dict['STRING_TABLE_INDEX'].append('_%d' % StringTa= bleIndex) if Sku.DefaultValue[0] =3D=3D 'L': DefaultValueBinStructure =3D StringToArray(Sku.Def= aultValue) - Size =3D len(DefaultValueBinStructure.replace(',',= ' ').split()) + Size =3D len(DefaultValueBinStructure.replace(',',= ' ').split()) Dict['STRING_TABLE_VALUE'].append(DefaultValueBinS= tructure) elif Sku.DefaultValue[0] =3D=3D '"': DefaultValueBinStructure =3D StringToArray(Sku.Def= aultValue) @@ -1696,7 +1696,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, = DynamicPcdList, Phase): =20 # print Phase Buffer =3D BuildExDataBase(Dict) - return AutoGenH, AutoGenC, Buffer,VarCheckTab + return AutoGenH, AutoGenC, Buffer, VarCheckTab =20 def GetOrderedDynamicPcdList(DynamicPcdList, PcdTokenNumberList): ReorderedDyPcdList =3D [None for i in range(len(DynamicPcdList))] diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/P= ython/AutoGen/GenVar.py index b82d7e4d2d37..a0b3497207b7 100644 --- a/BaseTools/Source/Python/AutoGen/GenVar.py +++ b/BaseTools/Source/Python/AutoGen/GenVar.py @@ -15,7 +15,7 @@ # Import Modules # from builtins import range -from struct import pack,unpack +from struct import pack, unpack import collections import copy from Common.VariableAttributes import VariableAttributes @@ -49,7 +49,7 @@ def PackGUID(Guid): return GuidBuffer =20 class VariableMgr(object): - def __init__(self, DefaultStoreMap,SkuIdMap): + def __init__(self, DefaultStoreMap, SkuIdMap): self.VarInfo =3D [] self.DefaultStoreMap =3D DefaultStoreMap self.SkuIdMap =3D SkuIdMap @@ -59,19 +59,19 @@ class VariableMgr(object): self.VarDefaultBuff =3D None self.VarDeltaBuff =3D None =20 - def append_variable(self,uefi_var): + def append_variable(self, uefi_var): self.VarInfo.append(uefi_var) =20 - def SetVpdRegionMaxSize(self,maxsize): + def SetVpdRegionMaxSize(self, maxsize): self.VpdRegionSize =3D maxsize =20 - def SetVpdRegionOffset(self,vpdoffset): + def SetVpdRegionOffset(self, vpdoffset): self.VpdRegionOffset =3D vpdoffset =20 - def PatchNVStoreDefaultMaxSize(self,maxsize): + def PatchNVStoreDefaultMaxSize(self, maxsize): if not self.NVHeaderBuff: return "" - self.NVHeaderBuff =3D self.NVHeaderBuff[:8] + pack("=3DQ",maxsize) + self.NVHeaderBuff =3D self.NVHeaderBuff[:8] + pack("=3DQ", maxsize) default_var_bin =3D self.format_data(self.NVHeaderBuff + self.VarD= efaultBuff + self.VarDeltaBuff) value_str =3D "{" default_var_bin_strip =3D [ data.strip("""'""") for data in defaul= t_var_bin] @@ -118,7 +118,7 @@ class VariableMgr(object): for item in self.VarInfo: if item.pcdindex not in indexedvarinfo: indexedvarinfo[item.pcdindex] =3D dict() - indexedvarinfo[item.pcdindex][(item.skuname,item.defaultstorag= 
ename)] =3D item + indexedvarinfo[item.pcdindex][(item.skuname, item.defaultstora= gename)] =3D item =20 for index in indexedvarinfo: sku_var_info =3D indexedvarinfo[index] @@ -126,44 +126,44 @@ class VariableMgr(object): default_data_buffer =3D "" others_data_buffer =3D "" tail =3D None - default_sku_default =3D indexedvarinfo.get(index).get(("DEFAUL= T","STANDARD")) + default_sku_default =3D indexedvarinfo.get(index).get(("DEFAUL= T", "STANDARD")) =20 - if default_sku_default.data_type not in ["UINT8","UINT16","UIN= T32","UINT64","BOOLEAN"]: + if default_sku_default.data_type not in ["UINT8", "UINT16", "U= INT32", "UINT64", "BOOLEAN"]: var_max_len =3D max([len(var_item.default_value.split(",")= ) for var_item in sku_var_info.values()]) if len(default_sku_default.default_value.split(",")) < var= _max_len: tail =3D ",".join([ "0x00" for i in range(var_max_len-= len(default_sku_default.default_value.split(",")))]) =20 - default_data_buffer =3D self.PACK_VARIABLES_DATA(default_sku_d= efault.default_value,default_sku_default.data_type,tail) + default_data_buffer =3D self.PACK_VARIABLES_DATA(default_sku_d= efault.default_value, default_sku_default.data_type, tail) =20 default_data_array =3D () for item in default_data_buffer: - default_data_array +=3D unpack("B",item) + default_data_array +=3D unpack("B", item) =20 - if ("DEFAULT","STANDARD") not in var_data: - var_data[("DEFAULT","STANDARD")] =3D collections.OrderedDi= ct() - var_data[("DEFAULT","STANDARD")][index] =3D (default_data_buff= er,sku_var_info[("DEFAULT","STANDARD")]) + if ("DEFAULT", "STANDARD") not in var_data: + var_data[("DEFAULT", "STANDARD")] =3D collections.OrderedD= ict() + var_data[("DEFAULT", "STANDARD")][index] =3D (default_data_buf= fer, sku_var_info[("DEFAULT", "STANDARD")]) =20 - for (skuid,defaultstoragename) in indexedvarinfo.get(index): + for (skuid, defaultstoragename) in indexedvarinfo.get(index): tail =3D None - if (skuid,defaultstoragename) =3D=3D ("DEFAULT","STANDARD"= ): + if (skuid, defaultstoragename) =3D=3D ("DEFAULT", "STANDAR= D"): continue - other_sku_other =3D indexedvarinfo.get(index).get((skuid,d= efaultstoragename)) + other_sku_other =3D indexedvarinfo.get(index).get((skuid, = defaultstoragename)) =20 - if default_sku_default.data_type not in ["UINT8","UINT16",= "UINT32","UINT64","BOOLEAN"]: + if default_sku_default.data_type not in ["UINT8", "UINT16"= , "UINT32", "UINT64", "BOOLEAN"]: if len(other_sku_other.default_value.split(",")) < var= _max_len: tail =3D ",".join([ "0x00" for i in range(var_max_= len-len(other_sku_other.default_value.split(",")))]) =20 - others_data_buffer =3D self.PACK_VARIABLES_DATA(other_sku_= other.default_value,other_sku_other.data_type,tail) + others_data_buffer =3D self.PACK_VARIABLES_DATA(other_sku_= other.default_value, other_sku_other.data_type, tail) =20 others_data_array =3D () for item in others_data_buffer: - others_data_array +=3D unpack("B",item) + others_data_array +=3D unpack("B", item) =20 data_delta =3D self.calculate_delta(default_data_array, ot= hers_data_array) =20 - if (skuid,defaultstoragename) not in var_data: - var_data[(skuid,defaultstoragename)] =3D collections.O= rderedDict() - var_data[(skuid,defaultstoragename)][index] =3D (data_delt= a,sku_var_info[(skuid,defaultstoragename)]) + if (skuid, defaultstoragename) not in var_data: + var_data[(skuid, defaultstoragename)] =3D collections.= OrderedDict() + var_data[(skuid, defaultstoragename)][index] =3D (data_del= ta, sku_var_info[(skuid, defaultstoragename)]) return var_data =20 def 
new_process_varinfo(self): @@ -174,17 +174,17 @@ class VariableMgr(object): if not var_data: return [] =20 - pcds_default_data =3D var_data.get(("DEFAULT","STANDARD"),{}) + pcds_default_data =3D var_data.get(("DEFAULT", "STANDARD"), {}) NvStoreDataBuffer =3D "" var_data_offset =3D collections.OrderedDict() offset =3D NvStorageHeaderSize - for default_data,default_info in pcds_default_data.values(): + for default_data, default_info in pcds_default_data.values(): var_name_buffer =3D self.PACK_VARIABLE_NAME(default_info.var_n= ame) =20 vendorguid =3D default_info.var_guid.split('-') =20 if default_info.var_attribute: - var_attr_value,_ =3D VariableAttributes.GetVarAttributes(d= efault_info.var_attribute) + var_attr_value, _ =3D VariableAttributes.GetVarAttributes(= default_info.var_attribute) else: var_attr_value =3D 0x07 =20 @@ -203,22 +203,22 @@ class VariableMgr(object): nv_default_part =3D self.AlignData(self.PACK_DEFAULT_DATA(0, 0, se= lf.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8) =20 data_delta_structure_buffer =3D "" - for skuname,defaultstore in var_data: - if (skuname,defaultstore) =3D=3D ("DEFAULT","STANDARD"): + for skuname, defaultstore in var_data: + if (skuname, defaultstore) =3D=3D ("DEFAULT", "STANDARD"): continue - pcds_sku_data =3D var_data.get((skuname,defaultstore)) + pcds_sku_data =3D var_data.get((skuname, defaultstore)) delta_data_set =3D [] for pcdindex in pcds_sku_data: offset =3D var_data_offset[pcdindex] - delta_data,_ =3D pcds_sku_data[pcdindex] + delta_data, _ =3D pcds_sku_data[pcdindex] delta_data =3D [(item[0] + offset, item[1]) for item in de= lta_data] delta_data_set.extend(delta_data) =20 - data_delta_structure_buffer +=3D self.AlignData(self.PACK_DELT= A_DATA(skuname,defaultstore,delta_data_set), 8) + data_delta_structure_buffer +=3D self.AlignData(self.PACK_DELT= A_DATA(skuname, defaultstore, delta_data_set), 8) =20 size =3D len(nv_default_part + data_delta_structure_buffer) + 16 maxsize =3D self.VpdRegionSize if self.VpdRegionSize else size - NV_Store_Default_Header =3D self.PACK_NV_STORE_DEFAULT_HEADER(size= ,maxsize) + NV_Store_Default_Header =3D self.PACK_NV_STORE_DEFAULT_HEADER(size= , maxsize) =20 self.NVHeaderBuff =3D NV_Store_Default_Header self.VarDefaultBuff =3Dnv_default_part @@ -226,14 +226,14 @@ class VariableMgr(object): return self.format_data(NV_Store_Default_Header + nv_default_part = + data_delta_structure_buffer) =20 =20 - def format_data(self,data): + def format_data(self, data): =20 return [hex(item) for item in self.unpack_data(data)] =20 - def unpack_data(self,data): + def unpack_data(self, data): final_data =3D () for item in data: - final_data +=3D unpack("B",item) + final_data +=3D unpack("B", item) return final_data =20 def calculate_delta(self, default, theother): @@ -242,7 +242,7 @@ class VariableMgr(object): data_delta =3D [] for i in range(len(default)): if default[i] !=3D theother[i]: - data_delta.append((i,theother[i])) + data_delta.append((i, theother[i])) return data_delta =20 def dump(self): @@ -256,40 +256,40 @@ class VariableMgr(object): return value_str return "" =20 - def PACK_VARIABLE_STORE_HEADER(self,size): + def PACK_VARIABLE_STORE_HEADER(self, size): #Signature: gEfiVariableGuid Guid =3D "{ 0xddcf3616, 0x3275, 0x4164, { 0x98, 0xb6, 0xfe, 0x85, = 0x70, 0x7f, 0xfe, 0x7d }}" Guid =3D GuidStructureStringToGuidString(Guid) GuidBuffer =3D PackGUID(Guid.split('-')) =20 - SizeBuffer =3D pack('=3DL',size) - FormatBuffer =3D pack('=3DB',0x5A) - StateBuffer =3D pack('=3DB',0xFE) - reservedBuffer =3D 
pack('=3DH',0) - reservedBuffer +=3D pack('=3DL',0) + SizeBuffer =3D pack('=3DL', size) + FormatBuffer =3D pack('=3DB', 0x5A) + StateBuffer =3D pack('=3DB', 0xFE) + reservedBuffer =3D pack('=3DH', 0) + reservedBuffer +=3D pack('=3DL', 0) =20 return GuidBuffer + SizeBuffer + FormatBuffer + StateBuffer + rese= rvedBuffer =20 - def PACK_NV_STORE_DEFAULT_HEADER(self,size,maxsize): - Signature =3D pack('=3DB',ord('N')) - Signature +=3D pack("=3DB",ord('S')) - Signature +=3D pack("=3DB",ord('D')) - Signature +=3D pack("=3DB",ord('B')) + def PACK_NV_STORE_DEFAULT_HEADER(self, size, maxsize): + Signature =3D pack('=3DB', ord('N')) + Signature +=3D pack("=3DB", ord('S')) + Signature +=3D pack("=3DB", ord('D')) + Signature +=3D pack("=3DB", ord('B')) =20 - SizeBuffer =3D pack("=3DL",size) - MaxSizeBuffer =3D pack("=3DQ",maxsize) + SizeBuffer =3D pack("=3DL", size) + MaxSizeBuffer =3D pack("=3DQ", maxsize) =20 return Signature + SizeBuffer + MaxSizeBuffer =20 - def PACK_VARIABLE_HEADER(self,attribute,namesize,datasize,vendorguid): + def PACK_VARIABLE_HEADER(self, attribute, namesize, datasize, vendorgu= id): =20 - Buffer =3D pack('=3DH',0x55AA) # pack StartID - Buffer +=3D pack('=3DB',0x3F) # pack State - Buffer +=3D pack('=3DB',0) # pack reserved + Buffer =3D pack('=3DH', 0x55AA) # pack StartID + Buffer +=3D pack('=3DB', 0x3F) # pack State + Buffer +=3D pack('=3DB', 0) # pack reserved =20 - Buffer +=3D pack('=3DL',attribute) - Buffer +=3D pack('=3DL',namesize) - Buffer +=3D pack('=3DL',datasize) + Buffer +=3D pack('=3DL', attribute) + Buffer +=3D pack('=3DL', namesize) + Buffer +=3D pack('=3DL', datasize) =20 Buffer +=3D PackGUID(vendorguid) =20 @@ -300,63 +300,63 @@ class VariableMgr(object): data_len =3D 0 if data_type =3D=3D "VOID*": for value_char in var_value.strip("{").strip("}").split(","): - Buffer +=3D pack("=3DB",int(value_char,16)) + Buffer +=3D pack("=3DB", int(value_char, 16)) data_len +=3D len(var_value.split(",")) if tail: for value_char in tail.split(","): - Buffer +=3D pack("=3DB",int(value_char,16)) + Buffer +=3D pack("=3DB", int(value_char, 16)) data_len +=3D len(tail.split(",")) elif data_type =3D=3D "BOOLEAN": - Buffer +=3D pack("=3DB",True) if var_value.upper() =3D=3D "TRU= E" else pack("=3DB",False) + Buffer +=3D pack("=3DB", True) if var_value.upper() =3D=3D "TR= UE" else pack("=3DB", False) data_len +=3D 1 elif data_type =3D=3D "UINT8": - Buffer +=3D pack("=3DB",GetIntegerValue(var_value)) + Buffer +=3D pack("=3DB", GetIntegerValue(var_value)) data_len +=3D 1 elif data_type =3D=3D "UINT16": - Buffer +=3D pack("=3DH",GetIntegerValue(var_value)) + Buffer +=3D pack("=3DH", GetIntegerValue(var_value)) data_len +=3D 2 elif data_type =3D=3D "UINT32": - Buffer +=3D pack("=3DL",GetIntegerValue(var_value)) + Buffer +=3D pack("=3DL", GetIntegerValue(var_value)) data_len +=3D 4 elif data_type =3D=3D "UINT64": - Buffer +=3D pack("=3DQ",GetIntegerValue(var_value)) + Buffer +=3D pack("=3DQ", GetIntegerValue(var_value)) data_len +=3D 8 =20 return Buffer =20 - def PACK_DEFAULT_DATA(self, defaultstoragename,skuid,var_value): + def PACK_DEFAULT_DATA(self, defaultstoragename, skuid, var_value): Buffer =3D "" - Buffer +=3D pack("=3DL",4+8+8) - Buffer +=3D pack("=3DQ",int(skuid)) - Buffer +=3D pack("=3DQ",int(defaultstoragename)) + Buffer +=3D pack("=3DL", 4+8+8) + Buffer +=3D pack("=3DQ", int(skuid)) + Buffer +=3D pack("=3DQ", int(defaultstoragename)) =20 for item in var_value: - Buffer +=3D pack("=3DB",item) + Buffer +=3D pack("=3DB", item) =20 - Buffer =3D pack("=3DL",len(Buffer)+4) + Buffer 
+ Buffer =3D pack("=3DL", len(Buffer)+4) + Buffer =20 return Buffer =20 - def GetSkuId(self,skuname): + def GetSkuId(self, skuname): if skuname not in self.SkuIdMap: return None return self.SkuIdMap.get(skuname)[0] - def GetDefaultStoreId(self,dname): + def GetDefaultStoreId(self, dname): if dname not in self.DefaultStoreMap: return None return self.DefaultStoreMap.get(dname)[0] - def PACK_DELTA_DATA(self,skuname,defaultstoragename,delta_list): + def PACK_DELTA_DATA(self, skuname, defaultstoragename, delta_list): skuid =3D self.GetSkuId(skuname) defaultstorageid =3D self.GetDefaultStoreId(defaultstoragename) Buffer =3D "" - Buffer +=3D pack("=3DL",4+8+8) - Buffer +=3D pack("=3DQ",int(skuid)) - Buffer +=3D pack("=3DQ",int(defaultstorageid)) - for (delta_offset,value) in delta_list: - Buffer +=3D pack("=3DL",delta_offset) - Buffer =3D Buffer[:-1] + pack("=3DB",value) + Buffer +=3D pack("=3DL", 4+8+8) + Buffer +=3D pack("=3DQ", int(skuid)) + Buffer +=3D pack("=3DQ", int(defaultstorageid)) + for (delta_offset, value) in delta_list: + Buffer +=3D pack("=3DL", delta_offset) + Buffer =3D Buffer[:-1] + pack("=3DB", value) =20 - Buffer =3D pack("=3DL",len(Buffer) + 4) + Buffer + Buffer =3D pack("=3DL", len(Buffer) + 4) + Buffer =20 return Buffer =20 @@ -364,13 +364,13 @@ class VariableMgr(object): mybuffer =3D data if (len(data) % align) > 0: for i in range(align - (len(data) % align)): - mybuffer +=3D pack("=3DB",0) + mybuffer +=3D pack("=3DB", 0) =20 return mybuffer =20 def PACK_VARIABLE_NAME(self, var_name): Buffer =3D "" for name_char in var_name.strip("{").strip("}").split(","): - Buffer +=3D pack("=3DB",int(name_char,16)) + Buffer +=3D pack("=3DB", int(name_char, 16)) =20 return Buffer diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Pyth= on/BPDG/GenVpd.py index daf11612d83b..1bb37d744ec9 100644 --- a/BaseTools/Source/Python/BPDG/GenVpd.py +++ b/BaseTools/Source/Python/BPDG/GenVpd.py @@ -350,7 +350,7 @@ class GenVPD : # # Enhanced for support "|" character in the string. # - ValueList =3D ['', '', '', '',''] + ValueList =3D ['', '', '', '', ''] =20 ValueRe =3D re.compile(r'\s*L?\".*\|.*\"\s*$') PtrValue =3D ValueRe.findall(line) @@ -400,7 +400,7 @@ class GenVPD : count =3D 0 for line in self.FileLinesList: if line !=3D None : - PCD =3D PcdEntry(line[0], line[1], line[2], line[3], line[= 4],line[5], self.InputFileName) =20 + PCD =3D PcdEntry(line[0], line[1], line[2], line[3], line[= 4], line[5], self.InputFileName) # Strip the space char PCD.PcdCName =3D PCD.PcdCName.strip(' ') PCD.SkuId =3D PCD.SkuId.strip(' ') @@ -514,10 +514,10 @@ class GenVPD : index =3D0 for pcd in self.PcdUnknownOffsetList: index +=3D 1 - if pcd.PcdCName =3D=3D ".".join(("gEfiMdeModulePkgTokenSpaceGu= id","PcdNvStoreDefaultValueBuffer")): + if pcd.PcdCName =3D=3D ".".join(("gEfiMdeModulePkgTokenSpaceGu= id", "PcdNvStoreDefaultValueBuffer")): if index !=3D len(self.PcdUnknownOffsetList): for i in range(len(self.PcdUnknownOffsetList) - index): - self.PcdUnknownOffsetList[index+i -1 ] , self.PcdU= nknownOffsetList[index+i] =3D self.PcdUnknownOffsetList[index+i] , self.Pcd= UnknownOffsetList[index+i -1] + self.PcdUnknownOffsetList[index+i -1 ], self.PcdUn= knownOffsetList[index+i] =3D self.PcdUnknownOffsetList[index+i], self.PcdUn= knownOffsetList[index+i -1] =20 # # Process all Offset value are "*" @@ -598,7 +598,7 @@ class GenVPD : eachUnfixedPcd.PcdOffset =3D str(hex(La= stOffset)) eachUnfixedPcd.PcdBinOffset =3D LastOffset # Insert this pcd into fixed offset pcd li= st. 
- self.PcdFixedOffsetSizeList.insert(FixOffs= etSizeListCount,eachUnfixedPcd) + self.PcdFixedOffsetSizeList.insert(FixOffs= etSizeListCount, eachUnfixedPcd) =20 # Delete the item's offset that has been f= ixed and added into fixed offset list self.PcdUnknownOffsetList.pop(countOfUnfix= edList) @@ -686,7 +686,7 @@ class GenVPD : for eachPcd in self.PcdFixedOffsetSizeList : # write map file try : - fMapFile.write("%s | %s | %s | %s | %s \n" % (eachPcd.Pcd= CName, eachPcd.SkuId,eachPcd.PcdOffset, eachPcd.PcdSize,eachPcd.PcdUnpackVa= lue)) + fMapFile.write("%s | %s | %s | %s | %s \n" % (eachPcd.Pcd= CName, eachPcd.SkuId, eachPcd.PcdOffset, eachPcd.PcdSize, eachPcd.PcdUnpack= Value)) except: EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE,= "Write data to file %s failed, please check whether the file been locked o= r using by other applications." % self.MapFileName, None) =20 diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/= Python/Common/DataType.py index 0bc2306ea61a..d69908dabfec 100644 --- a/BaseTools/Source/Python/Common/DataType.py +++ b/BaseTools/Source/Python/Common/DataType.py @@ -497,8 +497,8 @@ PCDS_DYNAMICEX_DEFAULT =3D "PcdsDynamicExDefault" PCDS_DYNAMICEX_VPD =3D "PcdsDynamicExVpd" PCDS_DYNAMICEX_HII =3D "PcdsDynamicExHii" =20 -SECTIONS_HAVE_ITEM_PCD =3D [PCDS_DYNAMIC_DEFAULT.upper(),PCDS_DYNAMIC_VPD.= upper(),PCDS_DYNAMIC_HII.upper(), \ - PCDS_DYNAMICEX_DEFAULT.upper(),PCDS_DYNAMICEX_VP= D.upper(),PCDS_DYNAMICEX_HII.upper()] +SECTIONS_HAVE_ITEM_PCD =3D [PCDS_DYNAMIC_DEFAULT.upper(), PCDS_DYNAMIC_VPD= .upper(), PCDS_DYNAMIC_HII.upper(), \ + PCDS_DYNAMICEX_DEFAULT.upper(), PCDS_DYNAMICEX_V= PD.upper(), PCDS_DYNAMICEX_HII.upper()] # Section allowed to have items after arch SECTIONS_HAVE_ITEM_AFTER_ARCH =3D [TAB_LIBRARY_CLASSES.upper(), TAB_DEPEX.= upper(), TAB_USER_EXTENSIONS.upper(), PCDS_DYNAMIC_DEFAULT.upper(), diff --git a/BaseTools/Source/Python/Common/DscClassObject.py b/BaseTools/S= ource/Python/Common/DscClassObject.py index f42d247cad33..e6abc1f036ac 100644 --- a/BaseTools/Source/Python/Common/DscClassObject.py +++ b/BaseTools/Source/Python/Common/DscClassObject.py @@ -1307,7 +1307,7 @@ class Dsc(DscObject): # Parse '!else' # if LineValue.upper().find(TAB_ELSE.upper()) > -1: - Key =3D IfDefList[-1][0].split(' ' , 1)[0].strip() + Key =3D IfDefList[-1][0].split(' ', 1)[0].strip() self.InsertConditionalStatement(Filename, FileID, Mode= l, IfDefList, StartLine, Arch) IfDefList.append((Key, StartLine, MODEL_META_DATA_COND= ITIONAL_STATEMENT_ELSE)) continue diff --git a/BaseTools/Source/Python/Common/EdkIIWorkspace.py b/BaseTools/S= ource/Python/Common/EdkIIWorkspace.py index ed85e4ee0b06..52f63ae53df8 100644 --- a/BaseTools/Source/Python/Common/EdkIIWorkspace.py +++ b/BaseTools/Source/Python/Common/EdkIIWorkspace.py @@ -114,7 +114,7 @@ class EdkIIWorkspace: # @retval string The full path filename # def WorkspaceFile(self, FileName): - return os.path.realpath(mws.join(self.WorkspaceDir,FileName)) + return os.path.realpath(mws.join(self.WorkspaceDir, FileName)) =20 ## Convert to a real path filename # diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Sourc= e/Python/Common/Expression.py index f7dbb29ee882..90ef92a14f41 100644 --- a/BaseTools/Source/Python/Common/Expression.py +++ b/BaseTools/Source/Python/Common/Expression.py @@ -164,7 +164,7 @@ class ValueExpression(object): if Oprand1[0] in ['"', "'"] or Oprand1.startswith('L"') or= Oprand1.startswith("L'")or Oprand1.startswith('UINT'): Oprand1, Size =3D 
ParseFieldValue(Oprand1) else: - Oprand1,Size =3D ParseFieldValue('"' + Oprand1 + '"') + Oprand1, Size =3D ParseFieldValue('"' + Oprand1 + '"') if type(Oprand2) =3D=3D type(''): if Oprand2[0] in ['"', "'"] or Oprand2.startswith('L"') or= Oprand2.startswith("L'") or Oprand2.startswith('UINT'): Oprand2, Size =3D ParseFieldValue(Oprand2) @@ -493,7 +493,7 @@ class ValueExpression(object): IsArray =3D IsGuid =3D False if len(Token.split(',')) =3D=3D 11 and len(Token.split(',{')) =3D= =3D 2 \ and len(Token.split('},')) =3D=3D 1: - HexLen =3D [11,6,6,5,4,4,4,4,4,4,6] + HexLen =3D [11, 6, 6, 5, 4, 4, 4, 4, 4, 4, 6] HexList=3D Token.split(',') if HexList[3].startswith('{') and \ not [Index for Index, Hex in enumerate(HexList) if len(Hex= ) > HexLen[Index]]: @@ -688,7 +688,7 @@ class ValueExpression(object): # Parse operator def _GetOperator(self): self.__SkipWS() - LegalOpLst =3D ['&&', '||', '!=3D', '=3D=3D', '>=3D', '<=3D'] + se= lf.NonLetterOpLst + ['?',':'] + LegalOpLst =3D ['&&', '||', '!=3D', '=3D=3D', '>=3D', '<=3D'] + se= lf.NonLetterOpLst + ['?', ':'] =20 self._Token =3D '' Expr =3D self._Expr[self._Idx:] diff --git a/BaseTools/Source/Python/Common/FdfParserLite.py b/BaseTools/So= urce/Python/Common/FdfParserLite.py index f2741616c46f..6b7612303730 100644 --- a/BaseTools/Source/Python/Common/FdfParserLite.py +++ b/BaseTools/Source/Python/Common/FdfParserLite.py @@ -2341,7 +2341,7 @@ class FdfParser(object): =20 AlignValue =3D None if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment '%s'" % self.__Token, s= elf.FileName, self.CurrentLineNumber) AlignValue =3D self.__Token @@ -2610,7 +2610,7 @@ class FdfParser(object): =20 AlignValue =3D None if self.__GetAlignment(): - if self.__Token not in ("8", "16", "32", "64", "128", "512", "= 1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("8", "16", "32", "64", "128", "512", "= 1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment '%s'" % self.__Token, s= elf.FileName, self.CurrentLineNumber) AlignValue =3D self.__Token @@ -2927,7 +2927,7 @@ class FdfParser(object): =20 AlignValue =3D "" if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment At Line ", self.FileNam= e, self.CurrentLineNumber) AlignValue =3D self.__Token @@ -2992,7 +2992,7 @@ class FdfParser(object): CheckSum =3D True =20 if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "12= 8", "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "12= 8", "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", = "8M", "16M"): raise Warning("Incorrect alignment At Line ", self.Fil= eName, self.CurrentLineNumber) if self.__Token =3D=3D 'Auto' and (not SectionName =3D=3D = 'PE32') and (not SectionName =3D=3D 'TE'): @@ -3067,7 +3067,7 @@ class FdfParser(object): FvImageSectionObj.FvFileType =3D self.__Token =20 if self.__GetAlignment(): - if self.__Token not in ("8", "16", "32", "64", "128", = "512", 
"1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4= M", "8M", "16M"): raise Warning("Incorrect alignment At Line ", self= .FileName, self.CurrentLineNumber) FvImageSectionObj.Alignment =3D self.__Token @@ -3135,7 +3135,7 @@ class FdfParser(object): EfiSectionObj.BuildNum =3D self.__Token =20 if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment '%s'" % self.__Token, s= elf.FileName, self.CurrentLineNumber) if self.__Token =3D=3D 'Auto' and (not SectionName =3D=3D 'PE3= 2') and (not SectionName =3D=3D 'TE'): diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Pyth= on/Common/Misc.py index 6878522d59d5..10cb95559822 100644 --- a/BaseTools/Source/Python/Common/Misc.py +++ b/BaseTools/Source/Python/Common/Misc.py @@ -125,7 +125,7 @@ def _parseForGCC(lines, efifilepath, varnames): if Str: m =3D re.match('^([\da-fA-Fx]+) +([\da-fA-Fx]+)', = Str.strip()) if m !=3D None: - varoffset.append((varname, int(m.groups(0)[0],= 16) , int(sections[-1][1], 16), sections[-1][0])) + varoffset.append((varname, int(m.groups(0)[0],= 16), int(sections[-1][1], 16), sections[-1][0])) =20 if not varoffset: return [] @@ -1475,15 +1475,15 @@ def AnalyzePcdExpression(Setting): return FieldList =20 def ParseDevPathValue (Value): - DevPathList =3D [ "Path","HardwarePath","Pci","PcCard","MemoryMapped",= "VenHw","Ctrl","BMC","AcpiPath","Acpi","PciRoot", - "PcieRoot","Floppy","Keyboard","Serial","ParallelPort"= ,"AcpiEx","AcpiExp","AcpiAdr","Msg","Ata","Scsi", - "Fibre","FibreEx","I1394","USB","I2O","Infiniband","Ve= nMsg","VenPcAnsi","VenVt100","VenVt100Plus", - "VenUtf8","UartFlowCtrl","SAS","SasEx","NVMe","UFS","S= D","eMMC","DebugPort","MAC","IPv4","IPv6","Uart", - "UsbClass","UsbAudio","UsbCDCControl","UsbHID","UsbIma= ge","UsbPrinter","UsbMassStorage","UsbHub", - "UsbCDCData","UsbSmartCard","UsbVideo","UsbDiagnostic"= ,"UsbWireless","UsbDeviceFirmwareUpdate", - "UsbIrdaBridge","UsbTestAndMeasurement","UsbWwid","Uni= t","iSCSI","Vlan","Uri","Bluetooth","Wi-Fi", - "MediaPath","HD","CDROM","VenMedia","Media","Fv","FvFi= le","Offset","RamDisk","VirtualDisk","VirtualCD", - "PersistentVirtualDisk","PersistentVirtualCD","BbsPath= ","BBS","Sata" ] + DevPathList =3D [ "Path", "HardwarePath", "Pci", "PcCard", "MemoryMapp= ed", "VenHw", "Ctrl", "BMC", "AcpiPath", "Acpi", "PciRoot", + "PcieRoot", "Floppy", "Keyboard", "Serial", "ParallelP= ort", "AcpiEx", "AcpiExp", "AcpiAdr", "Msg", "Ata", "Scsi", + "Fibre", "FibreEx", "I1394", "USB", "I2O", "Infiniband= ", "VenMsg", "VenPcAnsi", "VenVt100", "VenVt100Plus", + "VenUtf8", "UartFlowCtrl", "SAS", "SasEx", "NVMe", "UF= S", "SD", "eMMC", "DebugPort", "MAC", "IPv4", "IPv6", "Uart", + "UsbClass", "UsbAudio", "UsbCDCControl", "UsbHID", "Us= bImage", "UsbPrinter", "UsbMassStorage", "UsbHub", + "UsbCDCData", "UsbSmartCard", "UsbVideo", "UsbDiagnost= ic", "UsbWireless", "UsbDeviceFirmwareUpdate", + "UsbIrdaBridge", "UsbTestAndMeasurement", "UsbWwid", "= Unit", "iSCSI", "Vlan", "Uri", "Bluetooth", "Wi-Fi", + "MediaPath", "HD", "CDROM", "VenMedia", "Media", "Fv",= "FvFile", "Offset", "RamDisk", "VirtualDisk", "VirtualCD", + "PersistentVirtualDisk", "PersistentVirtualCD", "BbsPa= th", "BBS", "Sata" 
] if '\\' in Value: Value.replace('\\', '/').replace(' ', '') for Item in Value.split('/'): @@ -1665,7 +1665,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=3D''): # Value, Size =3D ParseFieldValue(Value) if Size: try: - int(Size,16) if Size.upper().startswith("0X") else int(Siz= e) + int(Size, 16) if Size.upper().startswith("0X") else int(Si= ze) except: IsValid =3D False Size =3D -1 @@ -1694,7 +1694,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=3D''): =20 if Size: try: - int(Size,16) if Size.upper().startswith("0X") else int(Siz= e) + int(Size, 16) if Size.upper().startswith("0X") else int(Si= ze) except: IsValid =3D False Size =3D -1 @@ -1716,7 +1716,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=3D''): IsValid =3D (len(FieldList) <=3D 3) if Size: try: - int(Size,16) if Size.upper().startswith("0X") else int(Siz= e) + int(Size, 16) if Size.upper().startswith("0X") else int(Si= ze) except: IsValid =3D False Size =3D -1 @@ -1920,7 +1920,7 @@ def ConvertStringToByteArray(Value): =20 Value =3D eval(Value) # translate escape character NewValue =3D '{' - for Index in range(0,len(Value)): + for Index in range(0, len(Value)): if Unicode: NewValue =3D NewValue + str(ord(Value[Index]) % 0x10000) + ',' else: @@ -2164,28 +2164,28 @@ class PeImageClass(): return Value =20 class DefaultStore(): - def __init__(self,DefaultStores ): + def __init__(self, DefaultStores): =20 self.DefaultStores =3D DefaultStores - def DefaultStoreID(self,DefaultStoreName): - for key,value in self.DefaultStores.items(): + def DefaultStoreID(self, DefaultStoreName): + for key, value in self.DefaultStores.items(): if value =3D=3D DefaultStoreName: return key return None def GetDefaultDefault(self): if not self.DefaultStores or "0" in self.DefaultStores: - return "0",TAB_DEFAULT_STORES_DEFAULT + return "0", TAB_DEFAULT_STORES_DEFAULT else: minvalue =3D min([int(value_str) for value_str in self.Default= Stores.keys()]) return (str(minvalue), self.DefaultStores[str(minvalue)]) - def GetMin(self,DefaultSIdList): + def GetMin(self, DefaultSIdList): if not DefaultSIdList: return "STANDARD" storeidset =3D {storeid for storeid, storename in self.DefaultStor= es.values() if storename in DefaultSIdList} if not storeidset: return "" minid =3D min(storeidset ) - for sid,name in self.DefaultStores.values(): + for sid, name in self.DefaultStores.values(): if sid =3D=3D minid: return name class SkuClass(): @@ -2200,7 +2200,7 @@ class SkuClass(): =20 for SkuName in SkuIds: SkuId =3D SkuIds[SkuName][0] - skuid_num =3D int(SkuId,16) if SkuId.upper().startswith("0X") = else int(SkuId) + skuid_num =3D int(SkuId, 16) if SkuId.upper().startswith("0X")= else int(SkuId) if skuid_num > 0xFFFFFFFFFFFFFFFF: EdkLogger.error("build", PARAMETER_INVALID, ExtraData =3D "SKU-ID [%s] value %s exceeds th= e max value of UINT64" @@ -2249,9 +2249,9 @@ class SkuClass(): self.__SkuInherit =3D {} for item in self.SkuData.values(): self.__SkuInherit[item[1]]=3Ditem[2] if item[2] else "DEFA= ULT" - return self.__SkuInherit.get(skuname,"DEFAULT") + return self.__SkuInherit.get(skuname, "DEFAULT") =20 - def GetSkuChain(self,sku): + def GetSkuChain(self, sku): if sku =3D=3D "DEFAULT": return ["DEFAULT"] skulist =3D [sku] diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/= Source/Python/Common/RangeExpression.py index 4357f240f423..496961554e87 100644 --- a/BaseTools/Source/Python/Common/RangeExpression.py +++ b/BaseTools/Source/Python/Common/RangeExpression.py @@ -176,7 +176,7 @@ class EQOperatorObject(object): raise 
BadExpression(ERR_SNYTAX % Expr) rangeId1 =3D str(uuid.uuid1()) rangeContainer =3D RangeContainer() - rangeContainer.push(RangeObject(int(Operand) , int(Operand))) + rangeContainer.push(RangeObject(int(Operand), int(Operand))) SymbolTable[rangeId1] =3D rangeContainer return rangeId1 =20 =20 @@ -473,7 +473,7 @@ class RangeExpression(object): =20 # [!]*A def _RelExpr(self): - if self._IsOperator(["NOT" , "LE", "GE", "LT", "GT", "EQ", "XOR"]): + if self._IsOperator(["NOT", "LE", "GE", "LT", "GT", "EQ", "XOR"]): Token =3D self._Token Val =3D self._NeExpr() try: diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Py= thon/Common/String.py index e6c7a3b74ee1..358e7b8d7c31 100644 --- a/BaseTools/Source/Python/Common/String.py +++ b/BaseTools/Source/Python/Common/String.py @@ -739,7 +739,7 @@ def SplitString(String): # @param StringList: A list for strings to be converted # def ConvertToSqlString(StringList): - return map(lambda s: s.replace("'", "''") , StringList) + return map(lambda s: s.replace("'", "''"), StringList) =20 ## Convert To Sql String # diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Sour= ce/Python/Common/VpdInfoFile.py index 84dd7ac563dd..d59697c64b68 100644 --- a/BaseTools/Source/Python/Common/VpdInfoFile.py +++ b/BaseTools/Source/Python/Common/VpdInfoFile.py @@ -89,7 +89,7 @@ class VpdInfoFile: # # @param offset integer value for VPD's offset in specific SKU. # - def Add(self, Vpd, skuname,Offset): + def Add(self, Vpd, skuname, Offset): if (Vpd =3D=3D None): EdkLogger.error("VpdInfoFile", BuildToolError.ATTRIBUTE_UNKNOW= N_ERROR, "Invalid VPD PCD entry.") =20 @@ -141,7 +141,7 @@ class VpdInfoFile: if PcdValue =3D=3D "" : PcdValue =3D Pcd.DefaultValue =20 - Content +=3D "%s.%s|%s|%s|%s|%s \n" % (Pcd.TokenSpaceGuid= CName, PcdTokenCName, skuname,str(self._VpdArray[Pcd][skuname]).strip(), st= r(Pcd.MaxDatumSize).strip(),PcdValue) + Content +=3D "%s.%s|%s|%s|%s|%s \n" % (Pcd.TokenSpaceGuid= CName, PcdTokenCName, skuname, str(self._VpdArray[Pcd][skuname]).strip(), s= tr(Pcd.MaxDatumSize).strip(), PcdValue) i +=3D 1 =20 return SaveFileOnChange(FilePath, Content, False) @@ -170,8 +170,8 @@ class VpdInfoFile: # the line must follow output format defined in BPDG spec. 
# try: - PcdName, SkuId,Offset, Size, Value =3D Line.split("#")[0].= split("|") - PcdName, SkuId,Offset, Size, Value =3D PcdName.strip(), Sk= uId.strip(),Offset.strip(), Size.strip(), Value.strip() + PcdName, SkuId, Offset, Size, Value =3D Line.split("#")[0]= .split("|") + PcdName, SkuId, Offset, Size, Value =3D PcdName.strip(), S= kuId.strip(), Offset.strip(), Size.strip(), Value.strip() TokenSpaceName, PcdTokenName =3D PcdName.split(".") except: EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR, "Fail= to parse VPD information file %s" % FilePath) @@ -180,7 +180,7 @@ class VpdInfoFile: =20 if (TokenSpaceName, PcdTokenName) not in self._VpdInfo: self._VpdInfo[(TokenSpaceName, PcdTokenName)] =3D [] - self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId,Of= fset, Value)) + self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId, O= ffset, Value)) for VpdObject in self._VpdArray.keys(): VpdObjectTokenCName =3D VpdObject.TokenCName for PcdItem in GlobalData.MixedPcd: diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Pyth= on/Ecc/CParser.py index 2df8fc3e0c26..bd4f10e1edff 100644 --- a/BaseTools/Source/Python/Ecc/CParser.py +++ b/BaseTools/Source/Python/Ecc/CParser.py @@ -785,10 +785,10 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if d !=3D None: - self.function_definition_stack[-1].ModifierText =3D = self.input.toString(d.start,d.stop) + self.function_definition_stack[-1].ModifierText =3D = self.input.toString(d.start, d.stop) else: self.function_definition_stack[-1].ModifierText =3D = '' - self.function_definition_stack[-1].DeclText =3D self.i= nput.toString(declarator1.start,declarator1.stop) + self.function_definition_stack[-1].DeclText =3D self.i= nput.toString(declarator1.start, declarator1.stop) self.function_definition_stack[-1].DeclLine =3D declar= ator1.start.line self.function_definition_stack[-1].DeclOffset =3D decl= arator1.start.charPositionInLine if a !=3D None: @@ -922,9 +922,9 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if b !=3D None: - self.StoreTypedefDefinition(a.line, a.charPositi= onInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop)= , self.input.toString(c.start,c.stop)) + self.StoreTypedefDefinition(a.line, a.charPositi= onInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop= ), self.input.toString(c.start, c.stop)) else: - self.StoreTypedefDefinition(a.line, a.charPositi= onInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.s= top)) + self.StoreTypedefDefinition(a.line, a.charPositi= onInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.= stop)) =20 =20 =20 @@ -959,7 +959,7 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if t !=3D None: - self.StoreVariableDeclaration(s.start.line, s.st= art.charPositionInLine, t.start.line, t.start.charPositionInLine, self.inpu= t.toString(s.start,s.stop), self.input.toString(t.start,t.stop)) + self.StoreVariableDeclaration(s.start.line, s.st= art.charPositionInLine, t.start.line, t.start.charPositionInLine, self.inpu= t.toString(s.start, s.stop), self.input.toString(t.start, t.stop)) =09 =20 =20 @@ -1403,7 +1403,7 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if s.stop !=3D None: - self.StoreStructUnionDefinition(s.start.line, s.= start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.inpu= t.toString(s.start,s.stop)) + self.StoreStructUnionDefinition(s.start.line, s.= start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, 
self.inpu= t.toString(s.start, s.stop)) =09 =20 =20 @@ -1418,7 +1418,7 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if e.stop !=3D None: - self.StoreEnumerationDefinition(e.start.line, e.= start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.inpu= t.toString(e.start,e.stop)) + self.StoreEnumerationDefinition(e.start.line, e.= start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.inpu= t.toString(e.start, e.stop)) =09 =20 =20 @@ -5401,7 +5401,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.postfix_expression_stack[-1].FuncCallText +=3D se= lf.input.toString(p.start,p.stop) + self.postfix_expression_stack[-1].FuncCallText +=3D se= lf.input.toString(p.start, p.stop) =20 # C.g:407:9: ( '[' expression ']' | '(' a=3D ')' | '(' c= =3D argument_expression_list b=3D ')' | '(' macro_parameter_list ')' | '.' = x=3D IDENTIFIER | '*' y=3D IDENTIFIER | '->' z=3D IDENTIFIER | '++' | '--' = )* while True: #loop65 @@ -5501,7 +5501,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StoreFunctionCalling(p.start.line, p.star= t.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression= _stack[-1].FuncCallText, self.input.toString(c.start,c.stop)) + self.StoreFunctionCalling(p.start.line, p.star= t.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression= _stack[-1].FuncCallText, self.input.toString(c.start, c.stop)) =20 =20 =20 @@ -8277,7 +8277,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 =20 =20 @@ -16384,7 +16384,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 self.following.append(self.FOLLOW_statement_in_selecti= on_statement2284) self.statement() @@ -16503,7 +16503,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 =20 =20 @@ -16535,7 +16535,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 =20 =20 @@ -16582,7 +16582,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + 
self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 =20 =20 diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.p= y b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py index e04b67732141..145c7435cd12 100644 --- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py +++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py @@ -562,7 +562,7 @@ class InfParser(MetaFileParser): NmakeLine =3D '' =20 # section content - self._ValueList =3D ['','',''] + self._ValueList =3D ['', '', ''] # parse current line, result will be put in self._ValueList self._SectionParser[self._SectionType](self) if self._ValueList =3D=3D None or self._ItemType =3D=3D MODEL_= META_DATA_DEFINE: @@ -921,7 +921,7 @@ class DscParser(MetaFileParser): =20 ## Directive statement parser def _DirectiveParser(self): - self._ValueList =3D ['','',''] + self._ValueList =3D ['', '', ''] TokenList =3D GetSplitValueList(self._CurrentLine, ' ', 1) self._ValueList[0:len(TokenList)] =3D TokenList =20 @@ -1111,7 +1111,7 @@ class DscParser(MetaFileParser): =20 ## Override parent's method since we'll do all macro replacements in p= arser def _GetMacros(self): - Macros =3D dict( [('ARCH','IA32'), ('FAMILY','MSFT'),('TOOL_CHAIN_= TAG','VS2008x86'),('TARGET','DEBUG')]) + Macros =3D dict( [('ARCH', 'IA32'), ('FAMILY', 'MSFT'), ('TOOL_CHA= IN_TAG', 'VS2008x86'), ('TARGET', 'DEBUG')]) Macros.update(self._FileLocalMacros) Macros.update(self._GetApplicableSectionMacro()) Macros.update(GlobalData.gEdkGlobal) @@ -1226,7 +1226,7 @@ class DscParser(MetaFileParser): self._RawTable.Drop() self._Table.Drop() for Record in RecordList: - EccGlobalData.gDb.TblDsc.Insert(Record[1],Record[2],Record[3],= Record[4],Record[5],Record[6],Record[7],Record[8],Record[9],Record[10],Reco= rd[11],Record[12],Record[13],Record[14]) + EccGlobalData.gDb.TblDsc.Insert(Record[1], Record[2], Record[3= ], Record[4], Record[5], Record[6], Record[7], Record[8], Record[9], Record= [10], Record[11], Record[12], Record[13], Record[14]) GlobalData.gPlatformDefines.update(self._FileLocalMacros) self._PostProcessed =3D True self._Content =3D None @@ -1247,7 +1247,7 @@ class DscParser(MetaFileParser): =20 def __RetrievePcdValue(self): Records =3D self._RawTable.Query(MODEL_PCD_FEATURE_FLAG, BelongsTo= Item=3D-1.0) - for TokenSpaceGuid,PcdName,Value,Dummy2,Dummy3,ID,Line in Records: + for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Re= cords: Value, DatumType, MaxDatumSize =3D AnalyzePcdData(Value) # Only use PCD whose value is straitforward (no macro and PCD) if self.SymbolPattern.findall(Value): @@ -1572,7 +1572,7 @@ class DecParser(MetaFileParser): continue =20 # section content - self._ValueList =3D ['','',''] + self._ValueList =3D ['', '', ''] self._SectionParser[self._SectionType[0]](self) if self._ValueList =3D=3D None or self._ItemType =3D=3D MODEL_= META_DATA_DEFINE: self._ItemType =3D -1 @@ -1718,7 +1718,7 @@ class DecParser(MetaFileParser): GuidValue =3D GuidValue.lstrip(' {') HexList.append('0x' + str(GuidValue[2:])) Index +=3D 1 - self._ValueList[1] =3D "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s= , %s, %s }}" % (HexList[0], HexList[1], HexList[2],HexList[3],HexList[4],He= xList[5],HexList[6],HexList[7],HexList[8],HexList[9],HexList[10]) + self._ValueList[1] =3D "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s= , %s, %s }}" % (HexList[0], HexList[1], HexList[2], HexList[3], HexList[4],= HexList[5], HexList[6], 
HexList[7], HexList[8], HexList[9], HexList[10]) else: EdkLogger.error('Parser', FORMAT_INVALID, "Invalid GUID value = format", ExtraData=3Dself._CurrentLine + \ diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Pyth= on/Eot/CParser.py index 2df8fc3e0c26..bd4f10e1edff 100644 --- a/BaseTools/Source/Python/Eot/CParser.py +++ b/BaseTools/Source/Python/Eot/CParser.py @@ -785,10 +785,10 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if d !=3D None: - self.function_definition_stack[-1].ModifierText =3D = self.input.toString(d.start,d.stop) + self.function_definition_stack[-1].ModifierText =3D = self.input.toString(d.start, d.stop) else: self.function_definition_stack[-1].ModifierText =3D = '' - self.function_definition_stack[-1].DeclText =3D self.i= nput.toString(declarator1.start,declarator1.stop) + self.function_definition_stack[-1].DeclText =3D self.i= nput.toString(declarator1.start, declarator1.stop) self.function_definition_stack[-1].DeclLine =3D declar= ator1.start.line self.function_definition_stack[-1].DeclOffset =3D decl= arator1.start.charPositionInLine if a !=3D None: @@ -922,9 +922,9 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if b !=3D None: - self.StoreTypedefDefinition(a.line, a.charPositi= onInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop)= , self.input.toString(c.start,c.stop)) + self.StoreTypedefDefinition(a.line, a.charPositi= onInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop= ), self.input.toString(c.start, c.stop)) else: - self.StoreTypedefDefinition(a.line, a.charPositi= onInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.s= top)) + self.StoreTypedefDefinition(a.line, a.charPositi= onInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.= stop)) =20 =20 =20 @@ -959,7 +959,7 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if t !=3D None: - self.StoreVariableDeclaration(s.start.line, s.st= art.charPositionInLine, t.start.line, t.start.charPositionInLine, self.inpu= t.toString(s.start,s.stop), self.input.toString(t.start,t.stop)) + self.StoreVariableDeclaration(s.start.line, s.st= art.charPositionInLine, t.start.line, t.start.charPositionInLine, self.inpu= t.toString(s.start, s.stop), self.input.toString(t.start, t.stop)) =09 =20 =20 @@ -1403,7 +1403,7 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if s.stop !=3D None: - self.StoreStructUnionDefinition(s.start.line, s.= start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.inpu= t.toString(s.start,s.stop)) + self.StoreStructUnionDefinition(s.start.line, s.= start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.inpu= t.toString(s.start, s.stop)) =09 =20 =20 @@ -1418,7 +1418,7 @@ class CParser(Parser): if self.backtracking =3D=3D 0: =20 if e.stop !=3D None: - self.StoreEnumerationDefinition(e.start.line, e.= start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.inpu= t.toString(e.start,e.stop)) + self.StoreEnumerationDefinition(e.start.line, e.= start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.inpu= t.toString(e.start, e.stop)) =09 =20 =20 @@ -5401,7 +5401,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.postfix_expression_stack[-1].FuncCallText +=3D se= lf.input.toString(p.start,p.stop) + self.postfix_expression_stack[-1].FuncCallText +=3D se= lf.input.toString(p.start, p.stop) =20 # C.g:407:9: ( '[' expression ']' | '(' a=3D ')' | '(' c= =3D 
argument_expression_list b=3D ')' | '(' macro_parameter_list ')' | '.' = x=3D IDENTIFIER | '*' y=3D IDENTIFIER | '->' z=3D IDENTIFIER | '++' | '--' = )* while True: #loop65 @@ -5501,7 +5501,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StoreFunctionCalling(p.start.line, p.star= t.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression= _stack[-1].FuncCallText, self.input.toString(c.start,c.stop)) + self.StoreFunctionCalling(p.start.line, p.star= t.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression= _stack[-1].FuncCallText, self.input.toString(c.start, c.stop)) =20 =20 =20 @@ -8277,7 +8277,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 =20 =20 @@ -16384,7 +16384,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 self.following.append(self.FOLLOW_statement_in_selecti= on_statement2284) self.statement() @@ -16503,7 +16503,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 =20 =20 @@ -16535,7 +16535,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 =20 =20 @@ -16582,7 +16582,7 @@ class CParser(Parser): if self.failed: return=20 if self.backtracking =3D=3D 0: - self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start,e.stop)) + self.StorePredicateExpression(e.start.line, e.star= t.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.to= String(e.start, e.stop)) =20 =20 =20 diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot= /c.py index c70f62f393a9..ceefc952237f 100644 --- a/BaseTools/Source/Python/Eot/c.py +++ b/BaseTools/Source/Python/Eot/c.py @@ -128,11 +128,11 @@ def GetIdentifierList(): =20 for pp in FileProfile.PPDirectiveList: Type =3D GetIdType(pp.Content) - IdPP =3D DataClass.IdentifierClass(-1, '', '', '', pp.Content, Typ= e, -1, -1, pp.StartPos[0],pp.StartPos[1],pp.EndPos[0],pp.EndPos[1]) + IdPP =3D DataClass.IdentifierClass(-1, '', '', '', pp.Content, Typ= e, -1, -1, pp.StartPos[0], pp.StartPos[1], pp.EndPos[0], pp.EndPos[1]) IdList.append(IdPP) =20 for ae in FileProfile.AssignmentExpressionList: 
- IdAE =3D DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, a= e.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.Start= Pos[0],ae.StartPos[1],ae.EndPos[0],ae.EndPos[1]) + IdAE =3D DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, a= e.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.Start= Pos[0], ae.StartPos[1], ae.EndPos[0], ae.EndPos[1]) IdList.append(IdAE) =20 FuncDeclPattern =3D GetFuncDeclPattern() @@ -154,7 +154,7 @@ def GetIdentifierList(): var.Modifier +=3D ' ' + FuncNamePartList[Index] var.Declarator =3D var.Declarator.lstrip().lstrip(Func= NamePartList[Index]) Index +=3D 1 - IdVar =3D DataClass.IdentifierClass(-1, var.Modifier, '', var.= Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, va= r.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1]) + IdVar =3D DataClass.IdentifierClass(-1, var.Modifier, '', var.= Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, va= r.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1]) IdList.append(IdVar) continue =20 @@ -167,7 +167,7 @@ def GetIdentifierList(): var.Modifier +=3D ' ' + Name[LSBPos:] Name =3D Name[0:LSBPos] =20 - IdVar =3D DataClass.IdentifierClass(-1, var.Modifier, '', = Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDEN= TIFIER_VARIABLE, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.= EndPos[1]) + IdVar =3D DataClass.IdentifierClass(-1, var.Modifier, '', = Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDEN= TIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], v= ar.EndPos[1]) IdList.append(IdVar) else: DeclList =3D var.Declarator.split('=3D') @@ -176,7 +176,7 @@ def GetIdentifierList(): LSBPos =3D var.Declarator.find('[') var.Modifier +=3D ' ' + Name[LSBPos:] Name =3D Name[0:LSBPos] - IdVar =3D DataClass.IdentifierClass(-1, var.Modifier, '', Name= , (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFI= ER_VARIABLE, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndP= os[1]) + IdVar =3D DataClass.IdentifierClass(-1, var.Modifier, '', Name= , (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFI= ER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.E= ndPos[1]) IdList.append(IdVar) =20 for enum in FileProfile.EnumerationDefinitionList: @@ -184,7 +184,7 @@ def GetIdentifierList(): RBPos =3D enum.Content.find('}') Name =3D enum.Content[4:LBPos].strip() Value =3D enum.Content[LBPos+1:RBPos] - IdEnum =3D DataClass.IdentifierClass(-1, '', '', Name, Value, Data= Class.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0],enum.StartPos[1]= ,enum.EndPos[0],enum.EndPos[1]) + IdEnum =3D DataClass.IdentifierClass(-1, '', '', Name, Value, Data= Class.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0], enum.StartPos[1= ], enum.EndPos[0], enum.EndPos[1]) IdList.append(IdEnum) =20 for su in FileProfile.StructUnionDefinitionList: @@ -201,7 +201,7 @@ def GetIdentifierList(): else: Name =3D su.Content[SkipLen:LBPos].strip() Value =3D su.Content[LBPos+1:RBPos] - IdPE =3D DataClass.IdentifierClass(-1, '', '', Name, Value, Type, = -1, -1, su.StartPos[0],su.StartPos[1],su.EndPos[0],su.EndPos[1]) + IdPE =3D DataClass.IdentifierClass(-1, '', '', Name, Value, Type, = -1, -1, su.StartPos[0], su.StartPos[1], su.EndPos[0], su.EndPos[1]) IdList.append(IdPE) =20 TdFuncPointerPattern =3D GetTypedefFuncPointerPattern() @@ -224,11 +224,11 @@ def GetIdentifierList(): Name =3D TmpStr[0:RBPos] Value =3D 
'FP' + TmpStr[RBPos + 1:] =20 - IdTd =3D DataClass.IdentifierClass(-1, Modifier, '', Name, Value, = DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0],td.StartPos[1],t= d.EndPos[0],td.EndPos[1]) + IdTd =3D DataClass.IdentifierClass(-1, Modifier, '', Name, Value, = DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0], td.StartPos[1],= td.EndPos[0], td.EndPos[1]) IdList.append(IdTd) =20 for funcCall in FileProfile.FunctionCallingList: - IdFC =3D DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, = funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, fu= ncCall.StartPos[0],funcCall.StartPos[1],funcCall.EndPos[0],funcCall.EndPos[= 1]) + IdFC =3D DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, = funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, fu= ncCall.StartPos[0], funcCall.StartPos[1], funcCall.EndPos[0], funcCall.EndP= os[1]) IdList.append(IdFC) return IdList =20 @@ -330,7 +330,7 @@ def GetFunctionList(): FuncDef.Modifier +=3D ' ' + FuncNamePartList[Index] Index +=3D 1 =20 - FuncObj =3D DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDe= f.Modifier, FuncName.strip(), '', FuncDef.StartPos[0],FuncDef.StartPos[1],F= uncDef.EndPos[0],FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBr= acePos[1], -1, ParamIdList, []) + FuncObj =3D DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDe= f.Modifier, FuncName.strip(), '', FuncDef.StartPos[0], FuncDef.StartPos[1],= FuncDef.EndPos[0], FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.Lef= tBracePos[1], -1, ParamIdList, []) FuncObjList.append(FuncObj) =20 return FuncObjList diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/S= ource/Python/GenFds/AprioriSection.py index 27fe2619a35f..b678079b3785 100644 --- a/BaseTools/Source/Python/GenFds/AprioriSection.py +++ b/BaseTools/Source/Python/GenFds/AprioriSection.py @@ -23,7 +23,7 @@ import FfsFileStatement from GenFdsGlobalVariable import GenFdsGlobalVariable from CommonDataClass.FdfClass import AprioriSectionClassObject from Common.String import * -from Common.Misc import SaveFileOnChange,PathClass +from Common.Misc import SaveFileOnChange, PathClass from Common import EdkLogger from Common.BuildToolError import * =20 diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Sour= ce/Python/GenFds/CapsuleData.py index 5b806d9e4482..1fa202149b25 100644 --- a/BaseTools/Source/Python/GenFds/CapsuleData.py +++ b/BaseTools/Source/Python/GenFds/CapsuleData.py @@ -207,7 +207,7 @@ class CapsulePayload(CapsuleData): # Guid =3D self.ImageTypeId.split('-') Buffer =3D pack('=3DILHHBBBBBBBBBBBBIIQ', - int(self.Version,16), + int(self.Version, 16), int(Guid[0], 16),=20 int(Guid[1], 16),=20 int(Guid[2], 16),=20 diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Sourc= e/Python/GenFds/EfiSection.py index 5029ec7a1823..d24df30cb734 100644 --- a/BaseTools/Source/Python/GenFds/EfiSection.py +++ b/BaseTools/Source/Python/GenFds/EfiSection.py @@ -130,7 +130,7 @@ class EfiSection (EfiSectionClassObject): elif FileList !=3D []: for File in FileList: Index =3D Index + 1 - Num =3D '%s.%d' %(SecNum , Index) + Num =3D '%s.%d' %(SecNum, Index) OutputFile =3D os.path.join(OutputPath, ModuleName + '= SEC' + Num + Ffs.SectionSuffix.get(SectionType)) f =3D open(File, 'r') VerString =3D f.read() @@ -187,7 +187,7 @@ class EfiSection (EfiSectionClassObject): elif FileList !=3D []: for File in FileList: Index =3D Index + 1 - Num =3D '%s.%d' %(SecNum , Index) + Num =3D '%s.%d' %(SecNum, Index) 
OutputFile =3D os.path.join(OutputPath, ModuleName + '= SEC' + Num + Ffs.SectionSuffix.get(SectionType)) f =3D open(File, 'r') UiString =3D f.read() @@ -228,7 +228,7 @@ class EfiSection (EfiSectionClassObject): for File in FileList: """ Copy Map file to FFS output path """ Index =3D Index + 1 - Num =3D '%s.%d' %(SecNum , Index) + Num =3D '%s.%d' %(SecNum, Index) OutputFile =3D os.path.join( OutputPath, ModuleName + = 'SEC' + Num + Ffs.SectionSuffix.get(SectionType)) File =3D GenFdsGlobalVariable.MacroExtend(File, Dict) =20 diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python= /GenFds/Fd.py index f735d3b5b015..21060625217e 100644 --- a/BaseTools/Source/Python/GenFds/Fd.py +++ b/BaseTools/Source/Python/GenFds/Fd.py @@ -136,7 +136,7 @@ class FD(FDClassObject): # Call each region's AddToBuffer function # GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToB= uffer function') - RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockS= izeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.Def= ineVarDict,Flag=3DFlag) + RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockS= izeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.Def= ineVarDict, Flag=3DFlag) # # Write the buffer contents to Fd file # @@ -162,7 +162,7 @@ class FD(FDClassObject): if len(RegionObj.RegionDataList) =3D=3D 1: RegionData =3D RegionObj.RegionDataList[0] FvList.append(RegionData.upper()) - FvAddDict[RegionData.upper()] =3D (int(self.BaseAddres= s,16) + \ + FvAddDict[RegionData.upper()] =3D (int(self.BaseAddres= s, 16) + \ RegionObj.Offset, RegionOb= j.Size) else: Offset =3D RegionObj.Offset @@ -177,7 +177,7 @@ class FD(FDClassObject): Size =3D 0 for blockStatement in FvObj.BlockSizeList: Size =3D Size + blockStatement[0] * blockS= tatement[1] - FvAddDict[RegionData.upper()] =3D (int(self.Ba= seAddress,16) + \ + FvAddDict[RegionData.upper()] =3D (int(self.Ba= seAddress, 16) + \ Offset, Size) Offset =3D Offset + Size # diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source= /Python/GenFds/FdfParser.py index d4ba485bcdff..43f849b07172 100644 --- a/BaseTools/Source/Python/GenFds/FdfParser.py +++ b/BaseTools/Source/Python/GenFds/FdfParser.py @@ -1855,7 +1855,7 @@ class FdfParser: return long( ValueExpression(Expr, self.__CollectMacroPcd() - )(True),0) + )(True), 0) except Exception: self.SetFileBufferPos(StartPos) return None @@ -2768,7 +2768,7 @@ class FdfParser: while True: AlignValue =3D None if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "12= 8", "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "12= 8", "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", = "8M", "16M"): raise Warning("Incorrect alignment '%s'" % self.__Toke= n, self.FileName, self.CurrentLineNumber) #For FFS, Auto is default option same to "" @@ -2828,7 +2828,7 @@ class FdfParser: FfsFileObj.CheckSum =3D True =20 if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment '%s'" % self.__Token, s= elf.FileName, self.CurrentLineNumber) #For FFS, Auto is default option same to "" @@ -2900,7 +2900,7 @@ class FdfParser: =20 AlignValue =3D None if self.__GetAlignment(): - if self.__Token not in 
("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment '%s'" % self.__Token, s= elf.FileName, self.CurrentLineNumber) AlignValue =3D self.__Token @@ -3190,7 +3190,7 @@ class FdfParser: =20 AlignValue =3D None if self.__GetAlignment(): - if self.__Token not in ("8", "16", "32", "64", "128", "512", "= 1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("8", "16", "32", "64", "128", "512", "= 1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment '%s'" % self.__Token, s= elf.FileName, self.CurrentLineNumber) AlignValue =3D self.__Token @@ -3583,7 +3583,7 @@ class FdfParser: AfileName =3D self.__Token AfileBaseName =3D os.path.basename(AfileName) =20 - if os.path.splitext(AfileBaseName)[1] not in [".bin",".BIN",".Bin= ",".dat",".DAT",".Dat",".data",".DATA",".Data"]: + if os.path.splitext(AfileBaseName)[1] not in [".bin", ".BIN", ".B= in", ".dat", ".DAT", ".Dat", ".data", ".DATA", ".Data"]: raise Warning('invalid binary file type, should be one of "bin= ","BIN","Bin","dat","DAT","Dat","data","DATA","Data"', \ self.FileName, self.CurrentLineNumber) =20 @@ -3782,7 +3782,7 @@ class FdfParser: =20 AlignValue =3D "" if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment '%s'" % self.__Token, s= elf.FileName, self.CurrentLineNumber) #For FFS, Auto is default option same to "" @@ -3832,7 +3832,7 @@ class FdfParser: =20 SectAlignment =3D "" if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "12= 8", "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "12= 8", "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", = "8M", "16M"): raise Warning("Incorrect alignment '%s'" % self.__Toke= n, self.FileName, self.CurrentLineNumber) if self.__Token =3D=3D 'Auto' and (not SectionName =3D=3D = 'PE32') and (not SectionName =3D=3D 'TE'): @@ -3912,7 +3912,7 @@ class FdfParser: FvImageSectionObj.FvFileType =3D self.__Token =20 if self.__GetAlignment(): - if self.__Token not in ("8", "16", "32", "64", "128", = "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4= M", "8M", "16M"): raise Warning("Incorrect alignment '%s'" % self.__= Token, self.FileName, self.CurrentLineNumber) FvImageSectionObj.Alignment =3D self.__Token @@ -3980,7 +3980,7 @@ class FdfParser: EfiSectionObj.BuildNum =3D self.__Token =20 if self.__GetAlignment(): - if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K" ,"64K", "128K", + if self.__Token not in ("Auto", "8", "16", "32", "64", "128", = "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M"= , "16M"): raise Warning("Incorrect alignment '%s'" % self.__Token, s= elf.FileName, self.CurrentLineNumber) if self.__Token =3D=3D 'Auto' and (not SectionName =3D=3D 'PE3= 2') and (not SectionName =3D=3D 'TE'): @@ -4720,7 +4720,7 @@ class FdfParser: FvInFdList =3D 
self.__GetFvInFd(RefFdName) if FvInFdList !=3D []: for FvNameInFd in FvInFdList: - LogStr +=3D "FD %s contains FV %s\n" % (RefFdN= ame,FvNameInFd) + LogStr +=3D "FD %s contains FV %s\n" % (RefFdN= ame, FvNameInFd) if FvNameInFd not in RefFvStack: RefFvStack.append(FvNameInFd) =20 @@ -4776,7 +4776,7 @@ class FdfParser: CapInFdList =3D self.__GetCapInFd(RefFdName) if CapInFdList !=3D []: for CapNameInFd in CapInFdList: - LogStr +=3D "FD %s contains Capsule %s\n" = % (RefFdName,CapNameInFd) + LogStr +=3D "FD %s contains Capsule %s\n" = % (RefFdName, CapNameInFd) if CapNameInFd not in RefCapStack: RefCapStack.append(CapNameInFd) =20 @@ -4787,7 +4787,7 @@ class FdfParser: FvInFdList =3D self.__GetFvInFd(RefFdName) if FvInFdList !=3D []: for FvNameInFd in FvInFdList: - LogStr +=3D "FD %s contains FV %s\n" % (Re= fFdName,FvNameInFd) + LogStr +=3D "FD %s contains FV %s\n" % (Re= fFdName, FvNameInFd) if FvNameInFd not in RefFvList: RefFvList.append(FvNameInFd) =20 diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/= Source/Python/GenFds/FfsInfStatement.py index b0b242be8d71..3a781d6d3a97 100644 --- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py +++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py @@ -430,7 +430,7 @@ class FfsInfStatement(FfsInfStatementClassObject): =20 self.__InfParse__(Dict) Arch =3D self.GetCurrentArch() - SrcFile =3D mws.join( GenFdsGlobalVariable.WorkSpaceDir , self.Inf= FileName); + SrcFile =3D mws.join( GenFdsGlobalVariable.WorkSpaceDir, self.InfF= ileName); DestFile =3D os.path.join( self.OutputPath, self.ModuleGuid + '.ff= s') =20 SrcFileDir =3D "." @@ -676,13 +676,13 @@ class FfsInfStatement(FfsInfStatementClassObject): Arch =3D self.CurrentArch =20 OutputPath =3D os.path.join(GenFdsGlobalVariable.OutputDirDict[Arc= h], - Arch , + Arch, ModulePath, FileName, 'OUTPUT' ) DebugPath =3D os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch= ], - Arch , + Arch, ModulePath, FileName, 'DEBUG' @@ -944,9 +944,9 @@ class FfsInfStatement(FfsInfStatementClassObject): Sect.FvParentAddr =3D FvParentAddr =20 if Rule.KeyStringList !=3D []: - SectList, Align =3D Sect.GenSection(self.OutputPath , self= .ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile =3D IsMakefile) + SectList, Align =3D Sect.GenSection(self.OutputPath, self.= ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile =3D IsMakefile) else : - SectList, Align =3D Sect.GenSection(self.OutputPath , self= .ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile =3D IsMakefile) + SectList, Align =3D Sect.GenSection(self.OutputPath, self.= ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile =3D IsMakefile) =20 if not HasGeneratedFlag: UniVfrOffsetFileSection =3D "" =20 @@ -1124,7 +1124,7 @@ class FfsInfStatement(FfsInfStatementClassObject): try : SaveFileOnChange(UniVfrOffsetFileName, fStringIO.getvalue()) except: - EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to f= ile %s failed, please check whether the file been locked or using by other = applications." %UniVfrOffsetFileName,None) + EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to f= ile %s failed, please check whether the file been locked or using by other = applications." 
%UniVfrOffsetFileName, None) =20 fStringIO.close () =20 diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python= /GenFds/Fv.py index 615d9e39faf1..c64c0c80e299 100644 --- a/BaseTools/Source/Python/GenFds/Fv.py +++ b/BaseTools/Source/Python/GenFds/Fv.py @@ -386,8 +386,8 @@ class FV (FvClassObject): # check if the file path exists or not if not os.path.isfile(FileFullPath): GenFdsGlobalVariable.ErrorLogger("Error opening FV= Extension Header Entry file %s." % (self.FvExtEntryData[Index])) - FvExtFile =3D open (FileFullPath,'rb') - FvExtFile.seek(0,2) + FvExtFile =3D open (FileFullPath, 'rb') + FvExtFile.seek(0, 2) Size =3D FvExtFile.tell() if Size >=3D 0x10000: GenFdsGlobalVariable.ErrorLogger("The size of FV E= xtension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Inde= x])) diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/S= ource/Python/GenFds/FvImageSection.py index 916ff919176c..ac5d5891df70 100644 --- a/BaseTools/Source/Python/GenFds/FvImageSection.py +++ b/BaseTools/Source/Python/GenFds/FvImageSection.py @@ -64,7 +64,7 @@ class FvImageSection(FvImageSectionClassObject): for FvFileName in FileList: FvAlignmentValue =3D 0 if os.path.isfile(FvFileName): - FvFileObj =3D open (FvFileName,'rb') + FvFileObj =3D open (FvFileName, 'rb') FvFileObj.seek(0) # PI FvHeader is 0x48 byte FvHeaderBuffer =3D FvFileObj.read(0x48) @@ -112,7 +112,7 @@ class FvImageSection(FvImageSectionClassObject): if self.FvFileName !=3D None: FvFileName =3D GenFdsGlobalVariable.ReplaceWorkspaceMa= cro(self.FvFileName) if os.path.isfile(FvFileName): - FvFileObj =3D open (FvFileName,'rb') + FvFileObj =3D open (FvFileName, 'rb') FvFileObj.seek(0) # PI FvHeader is 0x48 byte FvHeaderBuffer =3D FvFileObj.read(0x48) diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseT= ools/Source/Python/GenFds/GenFdsGlobalVariable.py index 94b8fedb233b..d7fd58c7482f 100644 --- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py +++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py @@ -342,7 +342,7 @@ class GenFdsGlobalVariable: for Arch in ArchList: GenFdsGlobalVariable.OutputDirDict[Arch] =3D os.path.normpath( os.path.join(GlobalData.gWorkspace, - WorkSpace.Db.BuildObject[GenFdsGlobalVariable= .ActivePlatform, Arch,GlobalData.gGlobalDefines['TARGET'], + WorkSpace.Db.BuildObject[GenFdsGlobalVariable= .ActivePlatform, Arch, GlobalData.gGlobalDefines['TARGET'], GlobalData.gGlobalDefines['TOOLCHAIN']].Outpu= tDirectory, GlobalData.gGlobalDefines['TARGET'] +'_' + Gl= obalData.gGlobalDefines['TOOLCHAIN'])) GenFdsGlobalVariable.OutputDirFromDscDict[Arch] =3D os.path.no= rmpath( @@ -549,7 +549,7 @@ class GenFdsGlobalVariable: =20 GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs upda= te because of newer %s" % (Output, Input)) if MakefilePath: - if (tuple(Cmd),tuple(GenFdsGlobalVariable.SecCmdList),tuple(Ge= nFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict.keys(): + if (tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(= GenFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict.keys= (): GenFdsGlobalVariable.FfsCmdDict[tuple(Cmd), tuple(GenFdsGl= obalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)] =3D Makefil= ePath GenFdsGlobalVariable.SecCmdList =3D [] GenFdsGlobalVariable.CopyList =3D [] diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b= /BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py index 127385228fcf..dbbb4312f47e 100644 --- 
a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py +++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py @@ -109,7 +109,7 @@ def _parseForGCC(lines, efifilepath): PcdName =3D m.groups(0)[0] m =3D re.match('^([\da-fA-Fx]+) +([\da-fA-Fx]+)', line= s[index + 1].strip()) if m !=3D None: - bpcds.append((PcdName, int(m.groups(0)[0], 16) , i= nt(sections[-1][1], 16), sections[-1][0])) + bpcds.append((PcdName, int(m.groups(0)[0], 16), in= t(sections[-1][1], 16), sections[-1][0])) =20 # get section information from efi file efisecs =3D PeImageClass(efifilepath).SectionHeaderList diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Sou= rce/Python/Pkcs7Sign/Pkcs7Sign.py index becf3e8eb9e8..1e07e23baeee 100644 --- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py +++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py @@ -89,7 +89,7 @@ if __name__ =3D=3D '__main__': parser.add_argument("--signature-size", dest=3D'SignatureSizeStr', type= =3Dstr, help=3D"specify the signature size for decode process.") parser.add_argument("-v", "--verbose", dest=3D'Verbose', action=3D"store= _true", help=3D"increase output messages") parser.add_argument("-q", "--quiet", dest=3D'Quiet', action=3D"store_tru= e", help=3D"reduce output messages") - parser.add_argument("--debug", dest=3D'Debug', type=3Dint, metavar=3D'[0= -9]', choices=3Dlist(range(0,10)), default=3D0, help=3D"set debug level") + parser.add_argument("--debug", dest=3D'Debug', type=3Dint, metavar=3D'[0= -9]', choices=3Dlist(range(0, 10)), default=3D0, help=3D"set debug level") parser.add_argument(metavar=3D"input_file", dest=3D'InputFile', type=3Da= rgparse.FileType('rb'), help=3D"specify the input filename") =20 # diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Generat= eKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateK= eys.py index 1641968ace0e..7d11758a795f 100644 --- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py +++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py @@ -52,7 +52,7 @@ if __name__ =3D=3D '__main__': parser.add_argument("--public-key-hash-c", dest=3D'PublicKeyHashCFile', = type=3Dargparse.FileType('wb'), help=3D"specify the public key hash filenam= e that is SHA 256 hash of 2048 bit RSA public key in C structure format") parser.add_argument("-v", "--verbose", dest=3D'Verbose', action=3D"store= _true", help=3D"increase output messages") parser.add_argument("-q", "--quiet", dest=3D'Quiet', action=3D"store_tru= e", help=3D"reduce output messages") - parser.add_argument("--debug", dest=3D'Debug', type=3Dint, metavar=3D'[0= -9]', choices=3Dlist(range(0,10)), default=3D0, help=3D"set debug level") + parser.add_argument("--debug", dest=3D'Debug', type=3Dint, metavar=3D'[0= -9]', choices=3Dlist(range(0, 10)), default=3D0, help=3D"set debug level") =20 # # Parse command line arguments diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py= b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py index 2a19ad973b91..e5f5a38bbc49 100644 --- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py +++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py @@ -51,7 +51,7 @@ EFI_HASH_ALGORITHM_SHA256_GUID =3D uuid.UUID('{51aa59de-f= df2-4ea3-bc63-875fb7842ee # UINT8 Signature[256]; # } EFI_CERT_BLOCK_RSA_2048_SHA256; # -EFI_CERT_BLOCK_RSA_2048_SHA256 =3D collections.namedtuple('EFI_CERT= _BLOCK_RSA_2048_SHA256', ['HashType','PublicKey','Signature']) 
+EFI_CERT_BLOCK_RSA_2048_SHA256 =3D collections.namedtuple('EFI_CERT= _BLOCK_RSA_2048_SHA256', ['HashType', 'PublicKey', 'Signature']) EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT =3D struct.Struct('16s256s256s') =20 # @@ -72,7 +72,7 @@ if __name__ =3D=3D '__main__': parser.add_argument("--private-key", dest=3D'PrivateKeyFile', type=3Darg= parse.FileType('rb'), help=3D"specify the private key filename. If not spe= cified, a test signing key is used.") parser.add_argument("-v", "--verbose", dest=3D'Verbose', action=3D"store= _true", help=3D"increase output messages") parser.add_argument("-q", "--quiet", dest=3D'Quiet', action=3D"store_tru= e", help=3D"reduce output messages") - parser.add_argument("--debug", dest=3D'Debug', type=3Dint, metavar=3D'[0= -9]', choices=3Dlist(range(0,10)), default=3D0, help=3D"set debug level") + parser.add_argument("--debug", dest=3D'Debug', type=3Dint, metavar=3D'[0= -9]', choices=3Dlist(range(0, 10)), default=3D0, help=3D"set debug level") parser.add_argument(metavar=3D"input_file", dest=3D'InputFile', type=3Da= rgparse.FileType('rb'), help=3D"specify the input filename") =20 # @@ -156,7 +156,7 @@ if __name__ =3D=3D '__main__': PublicKeyHexString =3D Process.communicate()[0].split('=3D')[1].strip() PublicKey =3D '' while len(PublicKeyHexString) > 0: - PublicKey =3D PublicKey + chr(int(PublicKeyHexString[0:2],16)) + PublicKey =3D PublicKey + chr(int(PublicKeyHexString[0:2], 16)) PublicKeyHexString=3DPublicKeyHexString[2:] if Process.returncode !=3D 0: sys.exit(Process.returncode) diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/S= ource/Python/TargetTool/TargetTool.py index ebed7a0ea7b8..fe74abb28901 100644 --- a/BaseTools/Source/Python/TargetTool/TargetTool.py +++ b/BaseTools/Source/Python/TargetTool/TargetTool.py @@ -59,11 +59,11 @@ class TargetTool(): def ConvertTextFileToDict(self, FileName, CommentCharacter, KeySplitCh= aracter): """Convert a text file to a dictionary of (name:value) pairs.""" try: - f =3D open(FileName,'r') + f =3D open(FileName, 'r') for Line in f: if Line.startswith(CommentCharacter) or Line.strip() =3D= =3D '': continue - LineList =3D Line.split(KeySplitCharacter,1) + LineList =3D Line.split(KeySplitCharacter, 1) if len(LineList) >=3D 2: Key =3D LineList[0].strip() if Key.startswith(CommentCharacter) =3D=3D False and K= ey in self.TargetTxtDictionary.keys(): @@ -104,7 +104,7 @@ class TargetTool(): if Line.startswith(CommentCharacter) or Line.strip() =3D= =3D '': fw.write(Line) else: - LineList =3D Line.split(KeySplitCharacter,1) + LineList =3D Line.split(KeySplitCharacter, 1) if len(LineList) >=3D 2: Key =3D LineList[0].strip() if Key.startswith(CommentCharacter) =3D=3D False a= nd Key in self.TargetTxtDictionary.keys(): @@ -203,14 +203,14 @@ def RangeCheckCallback(option, opt_str, value, parser= ): parser.error("Option %s only allows one instance in command line!"= % option) =20 def MyOptionParser(): - parser =3D OptionParser(version=3D__version__,prog=3D"TargetTool.exe",= usage=3D__usage__,description=3D__copyright__) - parser.add_option("-a", "--arch", action=3D"append", type=3D"choice", = choices=3D['IA32','X64','IPF','EBC', 'ARM', 'AARCH64','0'], dest=3D"TARGET_= ARCH", + parser =3D OptionParser(version=3D__version__, prog=3D"TargetTool.exe"= , usage=3D__usage__, description=3D__copyright__) + parser.add_option("-a", "--arch", action=3D"append", type=3D"choice", = choices=3D['IA32', 'X64', 'IPF', 'EBC', 'ARM', 'AARCH64', '0'], dest=3D"TAR= GET_ARCH", help=3D"ARCHS is one of list: IA32, X64, IPF, ARM, AARCH64 or 
EBC,= which replaces target.txt's TARGET_ARCH definition. To specify more archs,= please repeat this option. 0 will clear this setting in target.txt and can= 't combine with other value.") parser.add_option("-p", "--platform", action=3D"callback", type=3D"str= ing", dest=3D"DSCFILE", callback=3DSingleCheckCallback, help=3D"Specify a DSC file, which replace target.txt's ACTIVE_PLAT= FORM definition. 0 will clear this setting in target.txt and can't combine = with other value.") parser.add_option("-c", "--tooldef", action=3D"callback", type=3D"stri= ng", dest=3D"TOOL_DEFINITION_FILE", callback=3DSingleCheckCallback, help=3D"Specify the WORKSPACE relative path of tool_def.txt file, = which replace target.txt's TOOL_CHAIN_CONF definition. 0 will clear this se= tting in target.txt and can't combine with other value.") - parser.add_option("-t", "--target", action=3D"append", type=3D"choice"= , choices=3D['DEBUG','RELEASE','0'], dest=3D"TARGET", + parser.add_option("-t", "--target", action=3D"append", type=3D"choice"= , choices=3D['DEBUG', 'RELEASE', '0'], dest=3D"TARGET", help=3D"TARGET is one of list: DEBUG, RELEASE, which replaces targ= et.txt's TARGET definition. To specify more TARGET, please repeat this opti= on. 0 will clear this setting in target.txt and can't combine with other va= lue.") parser.add_option("-n", "--tagname", action=3D"callback", type=3D"stri= ng", dest=3D"TOOL_CHAIN_TAG", callback=3DSingleCheckCallback, help=3D"Specify the Tool Chain Tagname, which replaces target.txt'= s TOOL_CHAIN_TAG definition. 0 will clear this setting in target.txt and ca= n't combine with other value.") diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python= /Trim/Trim.py index 94f6b1bc707a..af1bf9de3e00 100644 --- a/BaseTools/Source/Python/Trim/Trim.py +++ b/BaseTools/Source/Python/Trim/Trim.py @@ -261,7 +261,7 @@ def TrimPreprocessedVfr(Source, Target): CreateDirectory(os.path.dirname(Target)) =20 try: - f =3D open (Source,'r') + f =3D open (Source, 'r') except: EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=3DSource) # read whole file @@ -310,7 +310,7 @@ def TrimPreprocessedVfr(Source, Target): =20 # save all lines trimmed try: - f =3D open (Target,'w') + f =3D open (Target, 'w') except: EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=3DTarget) f.writelines(Lines) @@ -407,7 +407,7 @@ def TrimAslFile(Source, Target, IncludePathFile): if IncludePathFile: try: LineNum =3D 0 - for Line in open(IncludePathFile,'r'): + for Line in open(IncludePathFile, 'r'): LineNum +=3D 1 if Line.startswith("/I") or Line.startswith ("-I"): IncludePathList.append(Line[2:].strip()) @@ -425,7 +425,7 @@ def TrimAslFile(Source, Target, IncludePathFile): =20 # save all lines trimmed try: - f =3D open (Target,'w') + f =3D open (Target, 'w') except: EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=3DTarget) =20 @@ -560,7 +560,7 @@ def TrimEdkSourceCode(Source, Target): CreateDirectory(os.path.dirname(Target)) =20 try: - f =3D open (Source,'rb') + f =3D open (Source, 'rb') except: EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=3DSource) # read whole file @@ -568,7 +568,7 @@ def TrimEdkSourceCode(Source, Target): f.close() =20 NewLines =3D None - for Re,Repl in gImportCodePatterns: + for Re, Repl in gImportCodePatterns: if NewLines =3D=3D None: NewLines =3D Re.sub(Repl, Lines) else: @@ -579,7 +579,7 @@ def TrimEdkSourceCode(Source, Target): return =20 try: - f =3D open (Target,'wb') + f =3D open (Target, 'wb') except: EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=3DTarget) 
f.write(NewLines) diff --git a/BaseTools/Source/Python/UPT/Core/DependencyRules.py b/BaseTool= s/Source/Python/UPT/Core/DependencyRules.py index 3a7c9809e31a..203f973669f3 100644 --- a/BaseTools/Source/Python/UPT/Core/DependencyRules.py +++ b/BaseTools/Source/Python/UPT/Core/DependencyRules.py @@ -285,8 +285,8 @@ class DependencyRules(object): pass DecPath =3D dirname(DecFile) if DecPath.find(WorkSP) > -1: - InstallPath =3D GetRelativePath(DecPath,WorkSP) - DecFileRelaPath =3D GetRelativePath(DecFile,WorkSP) + InstallPath =3D GetRelativePath(DecPath, WorkSP) + DecFileRelaPath =3D GetRelativePath(DecFile, WorkSP) else: InstallPath =3D DecPath DecFileRelaPath =3D DecFile @@ -348,8 +348,8 @@ class DependencyRules(object): pass DecPath =3D dirname(DecFile) if DecPath.find(WorkSP) > -1: - InstallPath =3D GetRelativePath(DecPath,WorkSP) - DecFileRelaPath =3D GetRelativePath(DecFile,WorkSP) + InstallPath =3D GetRelativePath(DecPath, WorkSP) + DecFileRelaPath =3D GetRelativePath(DecFile, WorkSP) else: InstallPath =3D DecPath DecFileRelaPath =3D DecFile diff --git a/BaseTools/Source/Python/UPT/Core/IpiDb.py b/BaseTools/Source/P= ython/UPT/Core/IpiDb.py index baf687ef99ba..44187a1ee40f 100644 --- a/BaseTools/Source/Python/UPT/Core/IpiDb.py +++ b/BaseTools/Source/Python/UPT/Core/IpiDb.py @@ -459,7 +459,7 @@ class IpiDatabase(object): (select InstallPath from ModInPkgInfo where=20 ModInPkgInfo.PackageGuid =3D'%s'=20 and ModInPkgInfo.PackageVersion =3D '%s')""" \ - % (Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg= [1],Pkg[0], Pkg[1]) + % (Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg= [1], Pkg[0], Pkg[1]) =20 self.Cur.execute(SqlCommand) # @@ -921,7 +921,7 @@ class IpiDatabase(object): def __ConvertToSqlString(self, StringList): if self.DpTable: pass - return map(lambda s: s.replace("'", "''") , StringList) + return map(lambda s: s.replace("'", "''"), StringList) =20 =20 =20 diff --git a/BaseTools/Source/Python/UPT/Library/String.py b/BaseTools/Sour= ce/Python/UPT/Library/String.py index 2f916324bd13..de3035279f01 100644 --- a/BaseTools/Source/Python/UPT/Library/String.py +++ b/BaseTools/Source/Python/UPT/Library/String.py @@ -633,7 +633,7 @@ def SplitString(String): # @param StringList: A list for strings to be converted # def ConvertToSqlString(StringList): - return map(lambda s: s.replace("'", "''") , StringList) + return map(lambda s: s.replace("'", "''"), StringList) =20 ## Convert To Sql String # diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/Ba= seTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py index 4c28b7f5d22a..1e0c79d6677d 100644 --- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py +++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py @@ -649,7 +649,7 @@ class DecPomAlignment(PackageObject): ContainerFile, (Item.TokenSpaceGuidCName, Item.TokenCName, Item.DefaultValue, Item.DatumType, Item.TokenValue, - Type, Item.GetHeadComment(), Item.GetTailComment()= ,''), + Type, Item.GetHeadComment(), Item.GetTailComment()= , ''), Language, self.DecParser.GetDefineSectionMacro() ) diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/U= PT/UPT.py index 84b3c353201a..12f091dd421b 100644 --- a/BaseTools/Source/Python/UPT/UPT.py +++ b/BaseTools/Source/Python/UPT/UPT.py @@ -315,7 +315,7 @@ def Main(): GlobalData.gDB.CloseDb() =20 if pf.system() =3D=3D 'Windows': - os.system('subst %s /D' % GlobalData.gWORKSPACE.replace('\\','= ')) + os.system('subst %s /D' % GlobalData.gWORKSPACE.replace('\\', = '')) =20 return ReturnCode =20 diff --git 
a/BaseTools/Source/Python/UPT/Xml/CommonXml.py b/BaseTools/Sourc= e/Python/UPT/Xml/CommonXml.py index e28aec5b9b05..498fe938aeab 100644 --- a/BaseTools/Source/Python/UPT/Xml/CommonXml.py +++ b/BaseTools/Source/Python/UPT/Xml/CommonXml.py @@ -355,7 +355,7 @@ class PackageHeaderXml(object): def FromXml(self, Item, Key, PackageObject2): if not Item: XmlTreeLevel =3D ['DistributionPackage', 'PackageSurfaceArea'] - CheckDict =3D {'PackageHeader':None, } + CheckDict =3D {'PackageHeader': None, } IsRequiredItemListNull(CheckDict, XmlTreeLevel) self.PackagePath =3D XmlElement(Item, '%s/PackagePath' % Key) self.Header.FromXml(Item, Key) diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParser.py b/BaseTools/Sourc= e/Python/UPT/Xml/XmlParser.py index b4d52f7bdc1f..bd7be102057a 100644 --- a/BaseTools/Source/Python/UPT/Xml/XmlParser.py +++ b/BaseTools/Source/Python/UPT/Xml/XmlParser.py @@ -104,7 +104,7 @@ class DistributionPackageXml(object): IsRequiredItemListNull(CheckDict, XmlTreeLevel) else: XmlTreeLevel =3D ['DistributionPackage', 'DistributionHead= er'] - CheckDict =3D CheckDict =3D {'DistributionHeader':'', } + CheckDict =3D CheckDict =3D {'DistributionHeader': '', } IsRequiredItemListNull(CheckDict, XmlTreeLevel) =20 # @@ -124,16 +124,16 @@ class DistributionPackageXml(object): # if self.DistP.Tools: XmlTreeLevel =3D ['DistributionPackage', 'Tools', 'Header'] - CheckDict =3D {'Name':self.DistP.Tools.GetName(), } + CheckDict =3D {'Name': self.DistP.Tools.GetName(), } IsRequiredItemListNull(CheckDict, XmlTreeLevel) =20 if not self.DistP.Tools.GetFileList(): XmlTreeLevel =3D ['DistributionPackage', 'Tools'] - CheckDict =3D {'FileName':None, } + CheckDict =3D {'FileName': None, } IsRequiredItemListNull(CheckDict, XmlTreeLevel) for Item in self.DistP.Tools.GetFileList(): XmlTreeLevel =3D ['DistributionPackage', 'Tools'] - CheckDict =3D {'FileName':Item.GetURI(), } + CheckDict =3D {'FileName': Item.GetURI(), } IsRequiredItemListNull(CheckDict, XmlTreeLevel) =20 # @@ -141,16 +141,16 @@ class DistributionPackageXml(object): # if self.DistP.MiscellaneousFiles: XmlTreeLevel =3D ['DistributionPackage', 'MiscellaneousFil= es', 'Header'] - CheckDict =3D {'Name':self.DistP.MiscellaneousFiles.GetNam= e(), } + CheckDict =3D {'Name': self.DistP.MiscellaneousFiles.GetNa= me(), } IsRequiredItemListNull(CheckDict, XmlTreeLevel) =20 if not self.DistP.MiscellaneousFiles.GetFileList(): XmlTreeLevel =3D ['DistributionPackage', 'Miscellaneou= sFiles'] - CheckDict =3D {'FileName':None, } + CheckDict =3D {'FileName': None, } IsRequiredItemListNull(CheckDict, XmlTreeLevel) for Item in self.DistP.MiscellaneousFiles.GetFileList(): XmlTreeLevel =3D ['DistributionPackage', 'Miscellaneou= sFiles'] - CheckDict =3D {'FileName':Item.GetURI(), } + CheckDict =3D {'FileName': Item.GetURI(), } IsRequiredItemListNull(CheckDict, XmlTreeLevel) =20 # @@ -158,7 +158,7 @@ class DistributionPackageXml(object): # for Item in self.DistP.UserExtensions: XmlTreeLevel =3D ['DistributionPackage', 'UserExtensions'] - CheckDict =3D {'UserId':Item.GetUserID(), } + CheckDict =3D {'UserId': Item.GetUserID(), } IsRequiredItemListNull(CheckDict, XmlTreeLevel) =20 =20 @@ -450,10 +450,10 @@ def ValidateMS1(Module, TopXmlTreeLevel): XmlTreeLevel =3D TopXmlTreeLevel + ['MiscellaneousFiles'] for Item in Module.GetMiscFileList(): if not Item.GetFileList(): - CheckDict =3D {'Filename':'', } + CheckDict =3D {'Filename': '', } IsRequiredItemListNull(CheckDict, XmlTreeLevel) for File in Item.GetFileList(): - CheckDict =3D {'Filename':File.GetURI(), } + CheckDict 
=3D {'Filename': File.GetURI(), } =20 ## ValidateMS2 # @@ -916,10 +916,10 @@ def ValidatePS2(Package): XmlTreeLevel =3D ['DistributionPackage', 'PackageSurfaceArea', 'Miscel= laneousFiles'] for Item in Package.GetMiscFileList(): if not Item.GetFileList(): - CheckDict =3D {'Filename':'', } + CheckDict =3D {'Filename': '', } IsRequiredItemListNull(CheckDict, XmlTreeLevel) for File in Item.GetFileList(): - CheckDict =3D {'Filename':File.GetURI(), } + CheckDict =3D {'Filename': File.GetURI(), } IsRequiredItemListNull(CheckDict, XmlTreeLevel) =20 ## ValidatePackageSurfaceArea diff --git a/BaseTools/Source/Python/Workspace/DecBuildData.py b/BaseTools/= Source/Python/Workspace/DecBuildData.py index 2fd3820dcc86..629df18fcbff 100644 --- a/BaseTools/Source/Python/Workspace/DecBuildData.py +++ b/BaseTools/Source/Python/Workspace/DecBuildData.py @@ -365,16 +365,16 @@ class DecBuildData(PackageBuildClassObject): =20 def ProcessStructurePcd(self, StructurePcdRawDataSet): s_pcd_set =3D dict() - for s_pcd,LineNo in StructurePcdRawDataSet: + for s_pcd, LineNo in StructurePcdRawDataSet: if s_pcd.TokenSpaceGuidCName not in s_pcd_set: s_pcd_set[s_pcd.TokenSpaceGuidCName] =3D [] - s_pcd_set[s_pcd.TokenSpaceGuidCName].append((s_pcd,LineNo)) + s_pcd_set[s_pcd.TokenSpaceGuidCName].append((s_pcd, LineNo)) =20 str_pcd_set =3D [] for pcdname in s_pcd_set: dep_pkgs =3D [] struct_pcd =3D StructurePcd() - for item,LineNo in s_pcd_set[pcdname]: + for item, LineNo in s_pcd_set[pcdname]: if "" in item.TokenCName: struct_pcd.StructuredPcdIncludeFile.append(item.Defaul= tValue) elif "" in item.TokenCName: @@ -386,7 +386,7 @@ class DecBuildData(PackageBuildClassObject): struct_pcd.PcdDefineLineNo =3D LineNo struct_pcd.PkgPath =3D self.MetaFile.File else: - struct_pcd.AddDefaultValue(item.TokenCName, item.Defau= ltValue,self.MetaFile.File,LineNo) + struct_pcd.AddDefaultValue(item.TokenCName, item.Defau= ltValue, self.MetaFile.File, LineNo) =20 struct_pcd.PackageDecs =3D dep_pkgs =20 @@ -409,7 +409,7 @@ class DecBuildData(PackageBuildClassObject): StrPcdSet =3D [] RecordList =3D self._RawData[Type, self._Arch] for TokenSpaceGuid, PcdCName, Setting, Arch, PrivateFlag, Dummy1, = Dummy2 in RecordList: - PcdDict[Arch, PcdCName, TokenSpaceGuid] =3D (Setting,Dummy2) + PcdDict[Arch, PcdCName, TokenSpaceGuid] =3D (Setting, Dummy2) if not (PcdCName, TokenSpaceGuid) in PcdSet: PcdSet.append((PcdCName, TokenSpaceGuid)) =20 @@ -418,7 +418,7 @@ class DecBuildData(PackageBuildClassObject): # limit the ARCH to self._Arch, if no self._Arch found, tdict # will automatically turn to 'common' ARCH and try again # - Setting,LineNo =3D PcdDict[self._Arch, PcdCName, TokenSpaceGui= d] + Setting, LineNo =3D PcdDict[self._Arch, PcdCName, TokenSpaceGu= id] if Setting =3D=3D None: continue =20 @@ -440,7 +440,7 @@ class DecBuildData(PackageBuildClassObject): list(expressions) ) if "." 
in TokenSpaceGuid: - StrPcdSet.append((PcdObj,LineNo)) + StrPcdSet.append((PcdObj, LineNo)) else: Pcds[PcdCName, TokenSpaceGuid, self._PCD_TYPE_STRING_[Type= ]] =3D PcdObj =20 diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/= Source/Python/Workspace/DscBuildData.py index e9fe533b3975..b08bdfbc4f4e 100644 --- a/BaseTools/Source/Python/Workspace/DscBuildData.py +++ b/BaseTools/Source/Python/Workspace/DscBuildData.py @@ -591,12 +591,12 @@ class DscBuildData(PlatformBuildClassObject): File=3Dself.MetaFile, Line=3DRecord[-1= ]) self._SkuIds[Record[1].upper()] =3D (str(self.ToInt(Record= [0])), Record[1].upper(), Record[2].upper()) if 'DEFAULT' not in self._SkuIds: - self._SkuIds['DEFAULT'] =3D ("0","DEFAULT","DEFAULT") + self._SkuIds['DEFAULT'] =3D ("0", "DEFAULT", "DEFAULT") if 'COMMON' not in self._SkuIds: - self._SkuIds['COMMON'] =3D ("0","DEFAULT","DEFAULT") + self._SkuIds['COMMON'] =3D ("0", "DEFAULT", "DEFAULT") return self._SkuIds - def ToInt(self,intstr): - return int(intstr,16) if intstr.upper().startswith("0X") else int(= intstr) + def ToInt(self, intstr): + return int(intstr, 16) if intstr.upper().startswith("0X") else int= (intstr) def _GetDefaultStores(self): if self.DefaultStores =3D=3D None: self.DefaultStores =3D sdict() @@ -616,9 +616,9 @@ class DscBuildData(PlatformBuildClassObject): if not IsValidWord(Record[1]): EdkLogger.error('build', FORMAT_INVALID, "The format o= f the DefaultStores ID name is invalid. The correct format is '(a-zA-Z0-9_)= (a-zA-Z0-9_-.)*'", File=3Dself.MetaFile, Line=3DRecord[-1= ]) - self.DefaultStores[Record[1].upper()] =3D (self.ToInt(Reco= rd[0]),Record[1].upper()) + self.DefaultStores[Record[1].upper()] =3D (self.ToInt(Reco= rd[0]), Record[1].upper()) if TAB_DEFAULT_STORES_DEFAULT not in self.DefaultStores: - self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] =3D (0,TAB_= DEFAULT_STORES_DEFAULT) + self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] =3D (0, TAB= _DEFAULT_STORES_DEFAULT) GlobalData.gDefaultStores =3D self.DefaultStores.keys() if GlobalData.gDefaultStores: GlobalData.gDefaultStores.sort() @@ -678,7 +678,7 @@ class DscBuildData(PlatformBuildClassObject): for Type in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_= MODULE, \ MODEL_PCD_FEATURE_FLAG, MODEL_PCD_DYNAMIC, MODEL_= PCD_DYNAMIC_EX]: RecordList =3D self._RawData[Type, self._Arch, None, Modul= eId] - for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dum= my3, Dummy4,Dummy5 in RecordList: + for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dum= my3, Dummy4, Dummy5 in RecordList: TokenList =3D GetSplitValueList(Setting) DefaultValue =3D TokenList[0] if len(TokenList) > 1: @@ -702,7 +702,7 @@ class DscBuildData(PlatformBuildClassObject): =20 # get module private build options RecordList =3D self._RawData[MODEL_META_DATA_BUILD_OPTION, sel= f._Arch, None, ModuleId] - for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3= , Dummy4,Dummy5 in RecordList: + for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3= , Dummy4, Dummy5 in RecordList: if (ToolChainFamily, ToolChain) not in Module.BuildOptions: Module.BuildOptions[ToolChainFamily, ToolChain] =3D Op= tion else: @@ -742,7 +742,7 @@ class DscBuildData(PlatformBuildClassObject): RecordList =3D self._RawData[MODEL_EFI_LIBRARY_CLASS, self._Ar= ch, None, -1] Macros =3D self._Macros for Record in RecordList: - LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Du= mmy,Dummy, LineNo =3D Record + LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Du= mmy, Dummy, LineNo =3D Record if 
LibraryClass =3D=3D '' or LibraryClass =3D=3D 'NULL': self._NullLibraryNumber +=3D 1 LibraryClass =3D 'NULL%d' % self._NullLibraryNumber @@ -809,7 +809,7 @@ class DscBuildData(PlatformBuildClassObject): ModuleData =3D self._Bdb[ModuleFile, self._Arch, self._Tar= get, self._Toolchain] PkgSet.update(ModuleData.Packages) =20 - self._DecPcds, self._GuidDict =3D GetDeclaredPcd(self, self._B= db, self._Arch, self._Target, self._Toolchain,PkgSet) + self._DecPcds, self._GuidDict =3D GetDeclaredPcd(self, self._B= db, self._Arch, self._Target, self._Toolchain, PkgSet) =20 =20 if (PcdCName, TokenSpaceGuid) not in self._DecPcds: @@ -854,14 +854,14 @@ class DscBuildData(PlatformBuildClassObject): ExtraData=3D"%s.%s" % (TokenSpaceGuid, Pcd= CName)) if PcdType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX= _DEFAULT): if self._DecPcds[PcdCName, TokenSpaceGuid].DatumType.strip= () !=3D ValueList[1].strip(): - EdkLogger.error('build', FORMAT_INVALID, "Pcd datumtyp= e used in DSC file is not the same as its declaration in DEC file." , File= =3Dself.MetaFile, Line=3DLineNo, + EdkLogger.error('build', FORMAT_INVALID, "Pcd datumtyp= e used in DSC file is not the same as its declaration in DEC file.", File= =3Dself.MetaFile, Line=3DLineNo, ExtraData=3D"%s.%s|%s" % (TokenSpaceGuid, = PcdCName, Setting)) if (TokenSpaceGuid + '.' + PcdCName) in GlobalData.gPlatformPcds: if GlobalData.gPlatformPcds[TokenSpaceGuid + '.' + PcdCName] != =3D ValueList[Index]: GlobalData.gPlatformPcds[TokenSpaceGuid + '.' + PcdCName] = =3D ValueList[Index] return ValueList =20 - def _FilterPcdBySkuUsage(self,Pcds): + def _FilterPcdBySkuUsage(self, Pcds): available_sku =3D self.SkuIdMgr.AvailableSkuIdSet sku_usage =3D self.SkuIdMgr.SkuUsageType if sku_usage =3D=3D SkuClass.SINGLE: @@ -877,7 +877,7 @@ class DscBuildData(PlatformBuildClassObject): if type(pcd) is StructurePcd and pcd.SkuOverrideValues: Pcds[pcdname].SkuOverrideValues =3D {skuid:pcd.SkuOver= rideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_= sku} return Pcds - def CompleteHiiPcdsDefaultStores(self,Pcds): + def CompleteHiiPcdsDefaultStores(self, Pcds): HiiPcd =3D [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in [self._= PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_D= YNAMIC_EX_HII]]] DefaultStoreMgr =3D DefaultStore(self.DefaultStores) for pcd in HiiPcd: @@ -894,15 +894,15 @@ class DscBuildData(PlatformBuildClassObject): if GlobalData.BuildOptionPcd: for pcd in GlobalData.BuildOptionPcd: if pcd[2] =3D=3D "": - pcdset.append((pcd[0],pcd[1],pcd[3])) + pcdset.append((pcd[0], pcd[1], pcd[3])) else: - pcdobj =3D self._Pcds.get((pcd[1],pcd[0])) + pcdobj =3D self._Pcds.get((pcd[1], pcd[0])) if pcdobj: - pcdset.append((pcd[0],pcd[1], pcdobj.DefaultValue)) + pcdset.append((pcd[0], pcd[1], pcdobj.DefaultValue= )) else: - pcdset.append((pcd[0],pcd[1],pcd[3])) + pcdset.append((pcd[0], pcd[1], pcd[3])) GlobalData.BuildOptionPcd =3D pcdset - def GetFieldValueFromComm(self,ValueStr,TokenSpaceGuidCName, TokenCNam= e, FieldName): + def GetFieldValueFromComm(self, ValueStr, TokenSpaceGuidCName, TokenCN= ame, FieldName): PredictedFieldType =3D "VOID*" if ValueStr.startswith('L'): if not ValueStr[1]: @@ -941,10 +941,10 @@ class DscBuildData(PlatformBuildClassObject): if not pcdvalue: EdkLogger.error('build', AUTOGEN_ERROR, "No Value spec= ified for the PCD %s." % (pcdname)) if '.' in pcdname: - (Name1, Name2) =3D pcdname.split('.',1) + (Name1, Name2) =3D pcdname.split('.', 1) if "." 
in Name2: - (Name3, FieldName) =3D Name2.split(".",1) - if ((Name3,Name1)) in self.DecPcds: + (Name3, FieldName) =3D Name2.split(".", 1) + if ((Name3, Name1)) in self.DecPcds: HasTokenSpace =3D True TokenCName =3D Name3 TokenSpaceGuidCName =3D Name1 @@ -954,7 +954,7 @@ class DscBuildData(PlatformBuildClassObject): TokenSpaceGuidCName =3D '' HasTokenSpace =3D False else: - if ((Name2,Name1)) in self.DecPcds: + if ((Name2, Name1)) in self.DecPcds: HasTokenSpace =3D True TokenCName =3D Name2 TokenSpaceGuidCName =3D Name1 @@ -990,7 +990,7 @@ class DscBuildData(PlatformBuildClassObject): FoundFlag =3D True if FieldName: NewValue =3D self.GetFieldValueFromComm(pcdvalue, Toke= nSpaceGuidCName, TokenCName, FieldName) - GlobalData.BuildOptionPcd[i] =3D (TokenSpaceGuidCName,= TokenCName, FieldName,NewValue,("build command options",1)) + GlobalData.BuildOptionPcd[i] =3D (TokenSpaceGuidCName,= TokenCName, FieldName, NewValue, ("build command options", 1)) else: for key in self.DecPcds: PcdItem =3D self.DecPcds[key] @@ -1029,7 +1029,7 @@ class DscBuildData(PlatformBuildClassObject): AUTOGEN_ERROR, "The Pcd %s is found under= multiple different TokenSpaceGuid: %s and %s." % (TokenCName, PcdItem.Toke= nSpaceGuidCName, TokenSpaceGuidCNameList[0]) ) - GlobalData.BuildOptionPcd[i] =3D (TokenSpaceGuidCName,= TokenCName, FieldName,NewValue,("build command options",1)) + GlobalData.BuildOptionPcd[i] =3D (TokenSpaceGuidCName,= TokenCName, FieldName, NewValue, ("build command options", 1)) if not FoundFlag: if HasTokenSpace: EdkLogger.error('build', AUTOGEN_ERROR, "The Pcd %= s.%s is not found in the DEC file." % (TokenSpaceGuidCName, TokenCName)) @@ -1065,17 +1065,17 @@ class DscBuildData(PlatformBuildClassObject): self.RecoverCommandLinePcd() return self._Pcds =20 - def _dumpPcdInfo(self,Pcds): + def _dumpPcdInfo(self, Pcds): for pcd in Pcds: pcdobj =3D Pcds[pcd] if not pcdobj.TokenCName.startswith("Test"): continue for skuid in pcdobj.SkuInfoList: - if pcdobj.Type in (self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMI= C_HII],self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]): + if pcdobj.Type in (self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMI= C_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]): for storename in pcdobj.SkuInfoList[skuid].DefaultStor= eDict: - print("PcdCName: %s, SkuName: %s, StoreName: %s, V= alue: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), sku= id,storename,str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename]))) + print("PcdCName: %s, SkuName: %s, StoreName: %s, V= alue: %s" % (".".join((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), sku= id, storename, str(pcdobj.SkuInfoList[skuid].DefaultStoreDict[storename]))) else: - print("PcdCName: %s, SkuName: %s, Value: %s" % (".".jo= in((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid,str(pcdobj.SkuIn= foList[skuid].DefaultValue))) + print("PcdCName: %s, SkuName: %s, Value: %s" % (".".jo= in((pcdobj.TokenSpaceGuidCName, pcdobj.TokenCName)), skuid, str(pcdobj.SkuI= nfoList[skuid].DefaultValue))) ## Retrieve [BuildOptions] def _GetBuildOptions(self): if self._BuildOptions =3D=3D None: @@ -1085,7 +1085,7 @@ class DscBuildData(PlatformBuildClassObject): # for CodeBase in (EDKII_NAME, EDK_NAME): RecordList =3D self._RawData[MODEL_META_DATA_BUILD_OPTION,= self._Arch, CodeBase] - for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Du= mmy3, Dummy4,Dummy5 in RecordList: + for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Du= mmy3, Dummy4, Dummy5 in RecordList: if Dummy3.upper() !=3D 'COMMON': continue CurKey =3D 
(ToolChainFamily, ToolChain, CodeBase) @@ -1108,7 +1108,7 @@ class DscBuildData(PlatformBuildClassObject): DriverType =3D '%s.%s' % (Edk, ModuleType) CommonDriverType =3D '%s.%s' % ('COMMON', ModuleType) RecordList =3D self._RawData[MODEL_META_DATA_BUILD_OPTION, sel= f._Arch] - for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3= , Dummy4,Dummy5 in RecordList: + for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3= , Dummy4, Dummy5 in RecordList: Type =3D Dummy2 + '.' + Dummy3 if Type.upper() =3D=3D DriverType.upper() or Type.upper() = =3D=3D CommonDriverType.upper(): Key =3D (ToolChainFamily, ToolChain, Edk) @@ -1122,28 +1122,28 @@ class DscBuildData(PlatformBuildClassObject): def GetStructurePcdInfo(self, PcdSet): structure_pcd_data =3D {} for item in PcdSet: - if (item[0],item[1]) not in structure_pcd_data: - structure_pcd_data[(item[0],item[1])] =3D [] - structure_pcd_data[(item[0],item[1])].append(item) + if (item[0], item[1]) not in structure_pcd_data: + structure_pcd_data[(item[0], item[1])] =3D [] + structure_pcd_data[(item[0], item[1])].append(item) =20 return structure_pcd_data - def OverrideByFdfComm(self,StruPcds): - StructurePcdInCom =3D {(item[0],item[1],item[2] ):(item[3],item[4]= ) for item in GlobalData.BuildOptionPcd if len(item) =3D=3D 5 and (item[1],= item[0]) in StruPcds } if GlobalData.BuildOptionPcd else {} - GlobalPcds =3D set([(item[0],item[1]) for item in StructurePcdInCo= m.keys()]) + def OverrideByFdfComm(self, StruPcds): + StructurePcdInCom =3D {(item[0], item[1], item[2] ):(item[3], item= [4]) for item in GlobalData.BuildOptionPcd if len(item) =3D=3D 5 and (item[= 1], item[0]) in StruPcds } if GlobalData.BuildOptionPcd else {} + GlobalPcds =3D set([(item[0], item[1]) for item in StructurePcdInC= om.keys()]) for Pcd in StruPcds.values(): - if (Pcd.TokenSpaceGuidCName,Pcd.TokenCName) not in GlobalPcds: + if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) not in GlobalPcds: continue - FieldValues =3D {item[2]:StructurePcdInCom[item] for item in S= tructurePcdInCom if (Pcd.TokenSpaceGuidCName,Pcd.TokenCName) =3D=3D (item[0= ],item[1]) and item[2]} + FieldValues =3D {item[2]:StructurePcdInCom[item] for item in S= tructurePcdInCom if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) =3D=3D (item[= 0], item[1]) and item[2]} for sku in Pcd.SkuOverrideValues: for defaultstore in Pcd.SkuOverrideValues[sku]: for field in FieldValues: if field not in Pcd.SkuOverrideValues[sku][default= store]: - Pcd.SkuOverrideValues[sku][defaultstore][field= ] =3D ["","",""] + Pcd.SkuOverrideValues[sku][defaultstore][field= ] =3D ["", "", ""] Pcd.SkuOverrideValues[sku][defaultstore][field][0]= =3D FieldValues[field][0] Pcd.SkuOverrideValues[sku][defaultstore][field][1]= =3D FieldValues[field][1][0] Pcd.SkuOverrideValues[sku][defaultstore][field][2]= =3D FieldValues[field][1][1] return StruPcds - def OverrideByFdfCommOverAll(self,AllPcds): + def OverrideByFdfCommOverAll(self, AllPcds): def CheckStructureInComm(commpcds): if not commpcds: return False @@ -1152,29 +1152,29 @@ class DscBuildData(PlatformBuildClassObject): return False =20 if CheckStructureInComm(GlobalData.BuildOptionPcd): - StructurePcdInCom =3D {(item[0],item[1],item[2] ):(item[3],ite= m[4]) for item in GlobalData.BuildOptionPcd } if GlobalData.BuildOptionPcd = else {} - NoFiledValues =3D {(item[0],item[1]):StructurePcdInCom[item] f= or item in StructurePcdInCom if not item[2]} + StructurePcdInCom =3D {(item[0], item[1], item[2] ):(item[3], = item[4]) for item in GlobalData.BuildOptionPcd } if 
GlobalData.BuildOptionP= cd else {} + NoFiledValues =3D {(item[0], item[1]):StructurePcdInCom[item] = for item in StructurePcdInCom if not item[2]} else: - NoFiledValues =3D {(item[0],item[1]):[item[2]] for item in Glo= balData.BuildOptionPcd} - for Guid,Name in NoFiledValues: - if (Name,Guid) in AllPcds: - Pcd =3D AllPcds.get((Name,Guid)) - Pcd.DefaultValue =3D NoFiledValues[(Pcd.TokenSpaceGuidCNam= e,Pcd.TokenCName)][0] + NoFiledValues =3D {(item[0], item[1]):[item[2]] for item in Gl= obalData.BuildOptionPcd} + for Guid, Name in NoFiledValues: + if (Name, Guid) in AllPcds: + Pcd =3D AllPcds.get((Name, Guid)) + Pcd.DefaultValue =3D NoFiledValues[(Pcd.TokenSpaceGuidCNam= e, Pcd.TokenCName)][0] for sku in Pcd.SkuInfoList: SkuInfo =3D Pcd.SkuInfoList[sku] if SkuInfo.DefaultValue: - SkuInfo.DefaultValue =3D NoFiledValues[(Pcd.TokenS= paceGuidCName,Pcd.TokenCName)][0] + SkuInfo.DefaultValue =3D NoFiledValues[(Pcd.TokenS= paceGuidCName, Pcd.TokenCName)][0] else: - SkuInfo.HiiDefaultValue =3D NoFiledValues[(Pcd.Tok= enSpaceGuidCName,Pcd.TokenCName)][0] + SkuInfo.HiiDefaultValue =3D NoFiledValues[(Pcd.Tok= enSpaceGuidCName, Pcd.TokenCName)][0] for defaultstore in SkuInfo.DefaultStoreDict: - SkuInfo.DefaultStoreDict[defaultstore] =3D NoF= iledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0] + SkuInfo.DefaultStoreDict[defaultstore] =3D NoF= iledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0] else: - PcdInDec =3D self.DecPcds.get((Name,Guid)) + PcdInDec =3D self.DecPcds.get((Name, Guid)) if PcdInDec: if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_= FIXED_AT_BUILD], self._PCD_TYPE_STRING_[MODEL_PCD_P= ATCHABLE_IN_MODULE]]: self.Pcds[Name, Guid] =3D copy.deepcopy(PcdInDec) - self.Pcds[Name, Guid].DefaultValue =3D NoFiledValu= es[( Guid,Name)][0] + self.Pcds[Name, Guid].DefaultValue =3D NoFiledValu= es[( Guid, Name)][0] return AllPcds def UpdateStructuredPcds(self, TypeList, AllPcds): =20 @@ -1198,7 +1198,7 @@ class DscBuildData(PlatformBuildClassObject): for Type in TypeList: RecordList.extend(self._RawData[Type, self._Arch]) =20 - for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, default_stor= e, Dummy4,Dummy5 in RecordList: + for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, default_stor= e, Dummy4, Dummy5 in RecordList: SkuName =3D SkuName.upper() default_store =3D default_store.upper() SkuName =3D 'DEFAULT' if SkuName =3D=3D 'COMMON' else SkuName @@ -1206,7 +1206,7 @@ class DscBuildData(PlatformBuildClassObject): continue =20 if SkuName in SkuIds and "." in TokenSpaceGuid: - S_PcdSet.append([ TokenSpaceGuid.split(".")[0],TokenSpaceG= uid.split(".")[1], PcdCName,SkuName, default_store,Dummy5, AnalyzePcdExpres= sion(Setting)[0]]) + S_PcdSet.append([ TokenSpaceGuid.split(".")[0], TokenSpace= Guid.split(".")[1], PcdCName, SkuName, default_store, Dummy5, AnalyzePcdExp= ression(Setting)[0]]) =20 # handle pcd value override StrPcdSet =3D self.GetStructurePcdInfo(S_PcdSet) @@ -1217,7 +1217,7 @@ class DscBuildData(PlatformBuildClassObject): if not isinstance (str_pcd_dec, StructurePcd): EdkLogger.error('build', PARSER_ERROR, "Pcd (%s.%s) is not declared as Structure PCD = in DEC files. 
Arch: ['%s']" % (str_pcd[0], str_pcd[1], self._Arch), - File=3Dself.MetaFile,Line =3D StrPcdSet[str_pc= d][0][5]) + File=3Dself.MetaFile, Line =3D StrPcdSet[str_p= cd][0][5]) if str_pcd_dec: str_pcd_obj_str =3D StructurePcd() str_pcd_obj_str.copy(str_pcd_dec) @@ -1226,12 +1226,12 @@ class DscBuildData(PlatformBuildClassObject): str_pcd_obj_str.DefaultFromDSC =3D str_pcd_obj_str.Default= Value for str_pcd_data in StrPcdSet[str_pcd]: if str_pcd_data[3] in SkuIds: - str_pcd_obj_str.AddOverrideValue(str_pcd_data[2], = str(str_pcd_data[6]), 'DEFAULT' if str_pcd_data[3] =3D=3D 'COMMON' else str= _pcd_data[3],'STANDARD' if str_pcd_data[4] =3D=3D 'COMMON' else str_pcd_dat= a[4], self.MetaFile.File,LineNo=3Dstr_pcd_data[5]) + str_pcd_obj_str.AddOverrideValue(str_pcd_data[2], = str(str_pcd_data[6]), 'DEFAULT' if str_pcd_data[3] =3D=3D 'COMMON' else str= _pcd_data[3], 'STANDARD' if str_pcd_data[4] =3D=3D 'COMMON' else str_pcd_da= ta[4], self.MetaFile.File, LineNo=3Dstr_pcd_data[5]) S_pcd_set[str_pcd[1], str_pcd[0]] =3D str_pcd_obj_str else: EdkLogger.error('build', PARSER_ERROR, "Pcd (%s.%s) defined in DSC is not declared in= DEC files. Arch: ['%s']" % (str_pcd[0], str_pcd[1], self._Arch), - File=3Dself.MetaFile,Line =3D StrPcdSet[str_pc= d][0][5]) + File=3Dself.MetaFile, Line =3D StrPcdSet[str_p= cd][0][5]) # Add the Structure PCD that only defined in DEC, don't have overr= ide in DSC file for Pcd in self.DecPcds: if type (self._DecPcds[Pcd]) is StructurePcd: @@ -1279,7 +1279,7 @@ class DscBuildData(PlatformBuildClassObject): S_pcd_set =3D self.OverrideByFdfComm(S_pcd_set) Str_Pcd_Values =3D self.GenerateByteArrayValue(S_pcd_set) if Str_Pcd_Values: - for (skuname,StoreName,PcdGuid,PcdName,PcdValue) in Str_Pcd_Va= lues: + for (skuname, StoreName, PcdGuid, PcdName, PcdValue) in Str_Pc= d_Values: str_pcd_obj =3D S_pcd_set.get((PcdName, PcdGuid)) if str_pcd_obj is None: print(PcdName, PcdGuid) @@ -1331,7 +1331,7 @@ class DscBuildData(PlatformBuildClassObject): elif 'DEFAULT' in pcd.SkuInfoList.keys() and 'COMMON' in p= cd.SkuInfoList.keys(): del(pcd.SkuInfoList['COMMON']) =20 - map(self.FilterSkuSettings,[Pcds[pcdkey] for pcdkey in Pcds if Pcd= s[pcdkey].Type in DynamicPcdType]) + map(self.FilterSkuSettings, [Pcds[pcdkey] for pcdkey in Pcds if Pc= ds[pcdkey].Type in DynamicPcdType]) return Pcds =20 ## Retrieve non-dynamic PCD settings @@ -1353,7 +1353,7 @@ class DscBuildData(PlatformBuildClassObject): # Find out all possible PCD candidates for self._Arch RecordList =3D self._RawData[Type, self._Arch] PcdValueDict =3D sdict() - for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dumm= y4,Dummy5 in RecordList: + for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dumm= y4, Dummy5 in RecordList: SkuName =3D SkuName.upper() SkuName =3D 'DEFAULT' if SkuName =3D=3D 'COMMON' else SkuName if SkuName not in AvailableSkuIdSet: @@ -1404,7 +1404,7 @@ class DscBuildData(PlatformBuildClassObject): =20 return Pcds =20 - def __UNICODE2OCTList(self,Value): + def __UNICODE2OCTList(self, Value): Value =3D Value.strip() Value =3D Value[2:-1] List =3D [] @@ -1415,7 +1415,7 @@ class DscBuildData(PlatformBuildClassObject): List.append('0x00') List.append('0x00') return List - def __STRING2OCTList(self,Value): + def __STRING2OCTList(self, Value): OCTList =3D [] Value =3D Value.strip('"') for char in Value: @@ -1502,7 +1502,7 @@ class DscBuildData(PlatformBuildClassObject): CApp =3D CApp + '\n' =20 if SkuName in Pcd.SkuInfoList: - DefaultValue =3D Pcd.SkuInfoList[SkuName].DefaultStoreDict= 
.get(DefaultStoreName,Pcd.SkuInfoList[SkuName].HiiDefaultValue) if Pcd.SkuI= nfoList[SkuName].HiiDefaultValue else Pcd.SkuInfoList[SkuName].DefaultValue + DefaultValue =3D Pcd.SkuInfoList[SkuName].DefaultStoreDict= .get(DefaultStoreName, Pcd.SkuInfoList[SkuName].HiiDefaultValue) if Pcd.Sku= InfoList[SkuName].HiiDefaultValue else Pcd.SkuInfoList[SkuName].DefaultVal= ue else: DefaultValue =3D Pcd.DefaultValue PcdDefaultValue =3D StringToArray(DefaultValue.strip()) @@ -1593,7 +1593,7 @@ class DscBuildData(PlatformBuildClassObject): try: Value, ValueSize =3D ParseFieldValue (FieldList[Fi= eldName][0]) except Exception: - EdkLogger.error('Build', FORMAT_INVALID, "Invalid = value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName= ,Pcd.TokenCName,FieldName)),FieldList[FieldName][1], FieldList[FieldName][2= ])) + EdkLogger.error('Build', FORMAT_INVALID, "Invalid = value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName= , Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName= ][2])) if isinstance(Value, str): CApp =3D CApp + ' Pcd->%s =3D %s; // From %s Line= %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[Fie= ldName][2], FieldList[FieldName][0]) elif IsArray: @@ -1610,7 +1610,7 @@ class DscBuildData(PlatformBuildClassObject): CApp =3D CApp + ' Pcd->%s =3D %d; // From %s = Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList= [FieldName][2], FieldList[FieldName][0]) for skuname in self.SkuIdMgr.GetSkuChain(SkuName): inherit_OverrideValues =3D Pcd.SkuOverrideValues[skuname] - for FieldList in [Pcd.DefaultFromDSC,inherit_OverrideValue= s.get(DefaultStoreName)]: + for FieldList in [Pcd.DefaultFromDSC, inherit_OverrideValu= es.get(DefaultStoreName)]: if not FieldList: continue if Pcd.DefaultFromDSC and FieldList =3D=3D Pcd.Default= FromDSC: @@ -1631,7 +1631,7 @@ class DscBuildData(PlatformBuildClassObject): try: Value, ValueSize =3D ParseFieldValue (FieldLis= t[FieldName][0]) except Exception: - EdkLogger.error('Build', FORMAT_INVALID, "Inva= lid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidC= Name,Pcd.TokenCName,FieldName)),FieldList[FieldName][1], FieldList[FieldNam= e][2])) + EdkLogger.error('Build', FORMAT_INVALID, "Inva= lid value format for %s. 
From %s Line %d " % (".".join((Pcd.TokenSpaceGuidC= Name, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[Field= Name][2])) if isinstance(Value, str): CApp =3D CApp + ' Pcd->%s =3D %s; // From %s = Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList= [FieldName][2], FieldList[FieldName][0]) elif IsArray: @@ -1834,7 +1834,7 @@ class DscBuildData(PlatformBuildClassObject): if FileLine.isdigit(): error_line =3D FileData[int (FileLine) - 1] if r"//" in error_line: - c_line,dsc_line =3D error_line.split(r"//") + c_line, dsc_line =3D error_line.split(r"//") else: dsc_line =3D error_line message_itmes =3D Message.split(":") @@ -1874,7 +1874,7 @@ class DscBuildData(PlatformBuildClassObject): for Pcd in FileBuffer: PcdValue =3D Pcd.split ('|') PcdInfo =3D PcdValue[0].split ('.') - StructurePcdSet.append((PcdInfo[0],PcdInfo[1], PcdInfo[2], Pcd= Info[3], PcdValue[2].strip())) + StructurePcdSet.append((PcdInfo[0], PcdInfo[1], PcdInfo[2], Pc= dInfo[3], PcdValue[2].strip())) return StructurePcdSet =20 ## Retrieve dynamic PCD settings @@ -1898,7 +1898,7 @@ class DscBuildData(PlatformBuildClassObject): AvailableSkuIdSet =3D copy.copy(self.SkuIds) =20 =20 - for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dumm= y4,Dummy5 in RecordList: + for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dumm= y4, Dummy5 in RecordList: SkuName =3D SkuName.upper() SkuName =3D 'DEFAULT' if SkuName =3D=3D 'COMMON' else SkuName if SkuName not in AvailableSkuIdSet: @@ -1960,7 +1960,7 @@ class DscBuildData(PlatformBuildClassObject): elif 'DEFAULT' in pcd.SkuInfoList.keys() and 'COMMON' in pcd.S= kuInfoList.keys(): del(pcd.SkuInfoList['COMMON']) =20 - map(self.FilterSkuSettings,Pcds.values()) + map(self.FilterSkuSettings, Pcds.values()) =20 return Pcds =20 @@ -1990,10 +1990,10 @@ class DscBuildData(PlatformBuildClassObject): return True else: return False - def CompletePcdValues(self,PcdSet): + def CompletePcdValues(self, PcdSet): Pcds =3D {} DefaultStoreObj =3D DefaultStore(self._GetDefaultStores()) - SkuIds =3D {skuname:skuid for skuname,skuid in self.SkuIdMgr.Avail= ableSkuIdSet.items() if skuname !=3D'COMMON'} + SkuIds =3D {skuname:skuid for skuname, skuid in self.SkuIdMgr.Avai= lableSkuIdSet.items() if skuname !=3D'COMMON'} DefaultStores =3D set([storename for pcdobj in PcdSet.values() for= skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStore= Dict.keys()]) for PcdCName, TokenSpaceGuid in PcdSet: PcdObj =3D PcdSet[(PcdCName, TokenSpaceGuid)] @@ -2014,7 +2014,7 @@ class DscBuildData(PlatformBuildClassObject): if defaultstorename not in skuobj.DefaultStoreDict: skuobj.DefaultStoreDict[defaultstorename] =3D = copy.deepcopy(skuobj.DefaultStoreDict[mindefaultstorename]) skuobj.HiiDefaultValue =3D skuobj.DefaultStoreDict[min= defaultstorename] - for skuname,skuid in SkuIds.items(): + for skuname, skuid in SkuIds.items(): if skuname not in PcdObj.SkuInfoList: nextskuid =3D self.SkuIdMgr.GetNextSkuId(skuname) while nextskuid not in PcdObj.SkuInfoList: @@ -2048,7 +2048,7 @@ class DscBuildData(PlatformBuildClassObject): AvailableSkuIdSet =3D copy.copy(self.SkuIds) DefaultStoresDefine =3D self._GetDefaultStores() =20 - for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, DefaultStore= , Dummy4,Dummy5 in RecordList: + for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, DefaultStore= , Dummy4, Dummy5 in RecordList: SkuName =3D SkuName.upper() SkuName =3D 'DEFAULT' if SkuName =3D=3D 'COMMON' else SkuName DefaultStore =3D DefaultStore.upper() @@ 
-2061,14 +2061,14 @@ class DscBuildData(PlatformBuildClassObject): EdkLogger.error('build', PARAMETER_INVALID, 'DefaultStores= %s is not defined in [DefaultStores] section' % DefaultStore, File=3Dself.MetaFile, Line=3DD= ummy5) if "." not in TokenSpaceGuid: - PcdSet.add((PcdCName, TokenSpaceGuid, SkuName,DefaultStore= , Dummy5)) - PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid,DefaultStore] = =3D Setting + PcdSet.add((PcdCName, TokenSpaceGuid, SkuName, DefaultStor= e, Dummy5)) + PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore]= =3D Setting =20 =20 # Remove redundant PCD candidates, per the ARCH and SKU - for PcdCName, TokenSpaceGuid, SkuName,DefaultStore, Dummy4 in PcdS= et: + for PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy4 in Pcd= Set: =20 - Setting =3D PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceG= uid,DefaultStore] + Setting =3D PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceG= uid, DefaultStore] if Setting =3D=3D None: continue VariableName, VariableGuid, VariableOffset, DefaultValue, VarA= ttribute =3D self._ValidatePcd(PcdCName, TokenSpaceGuid, Setting, Type, Dum= my4) @@ -2112,10 +2112,10 @@ class DscBuildData(PlatformBuildClassObject): Skuitem =3D pcdObject.SkuInfoList[SkuName] Skuitem.DefaultStoreDict.update({DefaultStore:DefaultV= alue}) else: - SkuInfo =3D SkuInfoClass(SkuName, self.SkuIds[SkuName]= [0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttr= ibute=3DVarAttribute,DefaultStore=3D{DefaultStore:DefaultValue}) + SkuInfo =3D SkuInfoClass(SkuName, self.SkuIds[SkuName]= [0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttr= ibute=3DVarAttribute, DefaultStore=3D{DefaultStore:DefaultValue}) pcdObject.SkuInfoList[SkuName] =3D SkuInfo else: - SkuInfo =3D SkuInfoClass(SkuName, self.SkuIds[SkuName][0],= VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribut= e=3DVarAttribute,DefaultStore=3D{DefaultStore:DefaultValue}) + SkuInfo =3D SkuInfoClass(SkuName, self.SkuIds[SkuName][0],= VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribut= e=3DVarAttribute, DefaultStore=3D{DefaultStore:DefaultValue}) Pcds[PcdCName, TokenSpaceGuid] =3D PcdClassObject( PcdCName, TokenSpaceGuid, @@ -2142,7 +2142,7 @@ class DscBuildData(PlatformBuildClassObject): sku.HiiDefaultValue =3D pcdDecObject.DefaultValue if 'DEFAULT' not in pcd.SkuInfoList.keys() and 'COMMON' not in= pcd.SkuInfoList.keys(): valuefromDec =3D pcdDecObject.DefaultValue - SkuInfo =3D SkuInfoClass('DEFAULT', '0', SkuInfoObj.Variab= leName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec,Va= riableAttribute=3DSkuInfoObj.VariableAttribute,DefaultStore=3D{DefaultStore= :valuefromDec}) + SkuInfo =3D SkuInfoClass('DEFAULT', '0', SkuInfoObj.Variab= leName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec, V= ariableAttribute=3DSkuInfoObj.VariableAttribute, DefaultStore=3D{DefaultSto= re:valuefromDec}) pcd.SkuInfoList['DEFAULT'] =3D SkuInfo elif 'DEFAULT' not in pcd.SkuInfoList.keys() and 'COMMON' in p= cd.SkuInfoList.keys(): pcd.SkuInfoList['DEFAULT'] =3D pcd.SkuInfoList['COMMON'] @@ -2170,19 +2170,19 @@ class DscBuildData(PlatformBuildClassObject): invalidpcd =3D ",".join(invalidhii) EdkLogger.error('build', PCD_VARIABLE_INFO_ERROR, Message=3D'T= he same HII PCD must map to the same EFI variable for all SKUs', File=3Dsel= f.MetaFile, ExtraData=3Dinvalidpcd) =20 - map(self.FilterSkuSettings,Pcds.values()) + map(self.FilterSkuSettings, Pcds.values()) =20 return Pcds =20 - def 
CheckVariableNameAssignment(self,Pcds): + def CheckVariableNameAssignment(self, Pcds): invalidhii =3D [] for pcdname in Pcds: pcd =3D Pcds[pcdname] - varnameset =3D set([sku.VariableName for (skuid,sku) in pcd.Sk= uInfoList.items()]) + varnameset =3D set([sku.VariableName for (skuid, sku) in pcd.S= kuInfoList.items()]) if len(varnameset) > 1: - invalidhii.append(".".join((pcdname[1],pcdname[0]))) + invalidhii.append(".".join((pcdname[1], pcdname[0]))) if len(invalidhii): - return False,invalidhii + return False, invalidhii else: return True, [] ## Retrieve dynamic VPD PCD settings @@ -2206,7 +2206,7 @@ class DscBuildData(PlatformBuildClassObject): RecordList =3D self._RawData[Type, self._Arch] AvailableSkuIdSet =3D copy.copy(self.SkuIds) =20 - for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dumm= y4,Dummy5 in RecordList: + for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dumm= y4, Dummy5 in RecordList: SkuName =3D SkuName.upper() SkuName =3D 'DEFAULT' if SkuName =3D=3D 'COMMON' else SkuName if SkuName not in AvailableSkuIdSet: @@ -2273,7 +2273,7 @@ class DscBuildData(PlatformBuildClassObject): del(pcd.SkuInfoList['COMMON']) =20 =20 - map(self.FilterSkuSettings,Pcds.values()) + map(self.FilterSkuSettings, Pcds.values()) return Pcds =20 ## Add external modules @@ -2338,7 +2338,7 @@ class DscBuildData(PlatformBuildClassObject): continue ModuleData =3D self._Bdb[ModuleFile, self._Arch, self._Tar= get, self._Toolchain] PkgSet.update(ModuleData.Packages) - self._DecPcds, self._GuidDict =3D GetDeclaredPcd(self, self._B= db, self._Arch, self._Target, self._Toolchain,PkgSet) + self._DecPcds, self._GuidDict =3D GetDeclaredPcd(self, self._B= db, self._Arch, self._Target, self._Toolchain, PkgSet) return self._DecPcds _Macros =3D property(_GetMacros) Arch =3D property(_GetArch, _SetArch) diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTool= s/Source/Python/Workspace/MetaFileParser.py index 4ad60498488b..8ceedf5aec78 100644 --- a/BaseTools/Source/Python/Workspace/MetaFileParser.py +++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py @@ -299,7 +299,7 @@ class MetaFileParser(object): for Item in GetSplitValueList(self._CurrentLine[1:-1], TAB_COMMA_S= PLIT): if Item =3D=3D '': continue - ItemList =3D GetSplitValueList(Item, TAB_SPLIT,3) + ItemList =3D GetSplitValueList(Item, TAB_SPLIT, 3) # different section should not mix in one section if self._SectionName !=3D '' and self._SectionName !=3D ItemLi= st[0].upper(): EdkLogger.error('Parser', FORMAT_INVALID, "Different secti= on names in the same section", @@ -417,7 +417,7 @@ class MetaFileParser(object): =20 ## Construct section Macro dict=20 def _ConstructSectionMacroDict(self, Name, Value): - ScopeKey =3D [(Scope[0], Scope[1],Scope[2]) for Scope in self._Sco= pe] + ScopeKey =3D [(Scope[0], Scope[1], Scope[2]) for Scope in self._Sc= ope] ScopeKey =3D tuple(ScopeKey) SectionDictKey =3D self._SectionType, ScopeKey # @@ -449,20 +449,20 @@ class MetaFileParser(object): continue =20 for ActiveScope in self._Scope: - Scope0, Scope1 ,Scope2=3D ActiveScope[0], ActiveScope[1],A= ctiveScope[2] - if(Scope0, Scope1,Scope2) not in Scope: + Scope0, Scope1, Scope2=3D ActiveScope[0], ActiveScope[1], = ActiveScope[2] + if(Scope0, Scope1, Scope2) not in Scope: break else: SpeSpeMacroDict.update(self._SectionsMacroDict[(SectionTyp= e, Scope)]) =20 for ActiveScope in self._Scope: - Scope0, Scope1,Scope2 =3D ActiveScope[0], ActiveScope[1],A= ctiveScope[2] - if(Scope0, Scope1,Scope2) not in Scope and (Scope0, "COMMO= 
N","COMMON") not in Scope and ("COMMON", Scope1,"COMMON") not in Scope: + Scope0, Scope1, Scope2 =3D ActiveScope[0], ActiveScope[1],= ActiveScope[2] + if(Scope0, Scope1, Scope2) not in Scope and (Scope0, "COMM= ON", "COMMON") not in Scope and ("COMMON", Scope1, "COMMON") not in Scope: break else: ComSpeMacroDict.update(self._SectionsMacroDict[(SectionTyp= e, Scope)]) =20 - if ("COMMON", "COMMON","COMMON") in Scope: + if ("COMMON", "COMMON", "COMMON") in Scope: ComComMacroDict.update(self._SectionsMacroDict[(SectionTyp= e, Scope)]) =20 Macros.update(ComComMacroDict) @@ -634,7 +634,7 @@ class InfParser(MetaFileParser): # Model, Value1, Value2, Value3, Arch, Platform, BelongsToItem= =3D-1, # LineBegin=3D-1, ColumnBegin=3D-1, LineEnd=3D-1, ColumnEnd=3D= -1, Enabled=3D-1 # - for Arch, Platform,_ in self._Scope: + for Arch, Platform, _ in self._Scope: LastItem =3D self._Store(self._SectionType, self._ValueList[0], self._ValueList[1], @@ -944,7 +944,7 @@ class DscParser(MetaFileParser): self._DirectiveParser() continue if Line[0] =3D=3D TAB_OPTION_START and not self._InSubsection: - EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the = '{' before %s in Line %s" % (Line, Index+1),ExtraData=3Dself.MetaFile) + EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the = '{' before %s in Line %s" % (Line, Index+1), ExtraData=3Dself.MetaFile) =20 if self._InSubsection: SectionType =3D self._SubsectionType @@ -1024,7 +1024,7 @@ class DscParser(MetaFileParser): ExtraData=3Dself._CurrentLine) =20 ItemType =3D self.DataType[DirectiveName] - Scope =3D [['COMMON', 'COMMON','COMMON']] + Scope =3D [['COMMON', 'COMMON', 'COMMON']] if ItemType =3D=3D MODEL_META_DATA_INCLUDE: Scope =3D self._Scope if ItemType =3D=3D MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF: @@ -1099,7 +1099,7 @@ class DscParser(MetaFileParser): @ParseMacro def _SkuIdParser(self): TokenList =3D GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT) - if len(TokenList) not in (2,3): + if len(TokenList) not in (2, 3): EdkLogger.error('Parser', FORMAT_INVALID, "Correct format is '= |[|]'", ExtraData=3Dself._CurrentLine, File=3Dself.Met= aFile, Line=3Dself._LineIndex + 1) self._ValueList[0:len(TokenList)] =3D TokenList @@ -1159,7 +1159,7 @@ class DscParser(MetaFileParser): =20 # Validate the datum type of Dynamic Defaul PCD and DynamicEx Defa= ult PCD ValueList =3D GetSplitValueList(self._ValueList[2]) - if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8 , TAB_UINT16,= TAB_UINT32 , TAB_UINT64] \ + if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8, TAB_UINT16, = TAB_UINT32, TAB_UINT64] \ and self._ItemType in [MODEL_PCD_DYNAMIC_DEF= AULT, MODEL_PCD_DYNAMIC_EX_DEFAULT]: EdkLogger.error('Parser', FORMAT_INVALID, "The datum type '%s'= of PCD is wrong" % ValueList[1], ExtraData=3Dself._CurrentLine, File=3Dself.Met= aFile, Line=3Dself._LineIndex + 1) @@ -1167,7 +1167,7 @@ class DscParser(MetaFileParser): # Validate the VariableName of DynamicHii and DynamicExHii for PCD= Entry must not be an empty string if self._ItemType in [MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_= HII]: DscPcdValueList =3D GetSplitValueList(TokenList[1], TAB_VALUE_= SPLIT, 1) - if len(DscPcdValueList[0].replace('L','').replace('"','').stri= p()) =3D=3D 0: + if len(DscPcdValueList[0].replace('L', '').replace('"', '').st= rip()) =3D=3D 0: EdkLogger.error('Parser', FORMAT_INVALID, "The VariableNam= e field in the HII format PCD entry must not be an empty string", ExtraData=3Dself._CurrentLine, File=3Dself.Met= aFile, Line=3Dself._LineIndex + 1) =20 @@ -1296,7 +1296,7 @@ 
class DscParser(MetaFileParser): self._ContentIndex =3D 0 self._InSubsection =3D False while self._ContentIndex < len(self._Content) : - Id, self._ItemType, V1, V2, V3, S1, S2, S3,Owner, self._From, \ + Id, self._ItemType, V1, V2, V3, S1, S2, S3, Owner, self._From,= \ LineStart, ColStart, LineEnd, ColEnd, Enabled =3D self._Co= ntent[self._ContentIndex] =20 if self._From < 0: @@ -1314,8 +1314,8 @@ class DscParser(MetaFileParser): break Record =3D self._Content[self._ContentIndex] if LineStart =3D=3D Record[10] and LineEnd =3D=3D Record[1= 2]: - if [Record[5], Record[6],Record[7]] not in self._Scope: - self._Scope.append([Record[5], Record[6],Record[7]= ]) + if [Record[5], Record[6], Record[7]] not in self._Scop= e: + self._Scope.append([Record[5], Record[6], Record[7= ]]) self._ContentIndex +=3D 1 else: break @@ -1404,7 +1404,7 @@ class DscParser(MetaFileParser): MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_DEFAUL= T, MODEL_PCD_DYNAMIC_EX_HII, MODEL_PCD_DYNAMIC_EX_VPD): Records =3D self._RawTable.Query(PcdType, BelongsToItem=3D -1.= 0) - for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4,ID,= Line in Records: + for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4, ID= , Line in Records: Name =3D TokenSpaceGuid + '.' + PcdName if Name not in GlobalData.gPlatformOtherPcds: PcdLine =3D Line @@ -1778,7 +1778,7 @@ class DecParser(MetaFileParser): if self._DefinesCount > 1: EdkLogger.error('Parser', FORMAT_INVALID, 'Multiple [Defines] = section is exist.', self.MetaFile ) if self._DefinesCount =3D=3D 0: - EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] sectio= n exist.',self.MetaFile) + EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] sectio= n exist.', self.MetaFile) self._Done() =20 =20 diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools= /Source/Python/Workspace/MetaFileTable.py index 92fcf6dd2b22..9416065b284f 100644 --- a/BaseTools/Source/Python/Workspace/MetaFileTable.py +++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py @@ -258,8 +258,8 @@ class PackageTable(MetaFileTable): ValidType =3D "@ValidList" if oricomment.startswith("@Expression"): ValidType =3D "@Expression" - EdkLogger.error('Parser', FORMAT_INVALID, "The syntax for %s o= f PCD %s.%s is incorrect" % (ValidType,TokenSpaceGuid, PcdCName), - ExtraData=3Doricomment,File=3Dself.MetaFile, L= ine=3DLineNum) + EdkLogger.error('Parser', FORMAT_INVALID, "The syntax for %s o= f PCD %s.%s is incorrect" % (ValidType, TokenSpaceGuid, PcdCName), + ExtraData=3Doricomment, File=3Dself.MetaFile, = Line=3DLineNum) return set(), set(), set() return set(validateranges), set(validlists), set(expressions) ## Python class representation of table storing platform data @@ -308,7 +308,7 @@ class PlatformTable(MetaFileTable): # def Insert(self, Model, Value1, Value2, Value3, Scope1=3D'COMMON', Sco= pe2=3D'COMMON', Scope3=3DTAB_DEFAULT_STORES_DEFAULT,BelongsToItem=3D-1, FromItem=3D-1, StartLine=3D-1, StartColumn=3D-1, EndLine=3D= -1, EndColumn=3D-1, Enabled=3D1): - (Value1, Value2, Value3, Scope1, Scope2,Scope3) =3D ConvertToSqlSt= ring((Value1, Value2, Value3, Scope1, Scope2,Scope3)) + (Value1, Value2, Value3, Scope1, Scope2, Scope3) =3D ConvertToSqlS= tring((Value1, Value2, Value3, Scope1, Scope2, Scope3)) return Table.Insert( self,=20 Model,=20 diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseToo= ls/Source/Python/Workspace/WorkspaceCommon.py index c760e57b8f64..6b5e0edb0a4d 100644 --- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py +++ 
b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py @@ -45,7 +45,7 @@ def GetPackageList(Platform, BuildDatabase, Arch, Target,= Toolchain): # @retval: A dictionary contains instances of PcdClassObject with key (Pc= dCName, TokenSpaceGuid) # @retval: A dictionary contains real GUIDs of TokenSpaceGuid # -def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain,additi= onalPkgs): +def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, addit= ionalPkgs): PkgList =3D GetPackageList(Platform, BuildDatabase, Arch, Target, Tool= chain) PkgList =3D set(PkgList) PkgList |=3D additionalPkgs diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Sourc= e/Python/build/BuildReport.py index e71c0abc25b9..aa357e4ed62b 100644 --- a/BaseTools/Source/Python/build/BuildReport.py +++ b/BaseTools/Source/Python/build/BuildReport.py @@ -1213,16 +1213,16 @@ class PcdReport(object): else: if IsByteArray: if self.SkuSingle: - FileWrite(File, ' %-*s : %6s %10s =3D %s= ' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', "{")) + FileWrite(File, ' %-*s : %6s %10s =3D %s= ' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', "{")) else: - FileWrite(File, ' %-*s : %6s %10s %10s = =3D %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + Sk= uIdName + ')', "{")) + FileWrite(File, ' %-*s : %6s %10s %10s = =3D %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + Sku= IdName + ')', "{")) for Array in ArrayList: FileWrite(File, '%s' % (Array)) else: if self.SkuSingle: - FileWrite(File, ' %-*s : %6s %10s =3D %s= ' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', Value)) + FileWrite(File, ' %-*s : %6s %10s =3D %s= ' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', Value)) else: - FileWrite(File, ' %-*s : %6s %10s %10s = =3D %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + Sk= uIdName + ')', Value)) + FileWrite(File, ' %-*s : %6s %10s %10s = =3D %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + Sku= IdName + ')', Value)) if TypeName in ('DYNVPD', 'DEXVPD'): FileWrite(File, '%*s' % (self.MaxLen + 4, SkuInfo.= VpdOffset)) if IsStructure: diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Pyth= on/build/build.py index c6a37ab1d9a3..6fbaad4c0fb6 100644 --- a/BaseTools/Source/Python/build/build.py +++ b/BaseTools/Source/Python/build/build.py @@ -838,7 +838,7 @@ class Build(): self.HashSkipModules =3D [] self.Db_Flag =3D False self.LaunchPrebuildFlag =3D False - self.PlatformBuildPath =3D os.path.join(GlobalData.gConfDirectory,= '.cache', '.PlatformBuild') + self.PlatformBuildPath =3D os.path.join(GlobalData.gConfDirectory,= '.cache', '.PlatformBuild') if BuildOptions.CommandLength: GlobalData.gCommandMaxLength =3D BuildOptions.CommandLength =20 @@ -1131,7 +1131,7 @@ class Build(): # and preserve them for the rest of the main build step, becau= se the child process environment will # evaporate as soon as it exits, we cannot get it in build ste= p. 
# - PrebuildEnvFile =3D os.path.join(GlobalData.gConfDirectory,'.c= ache','.PrebuildEnv') + PrebuildEnvFile =3D os.path.join(GlobalData.gConfDirectory, '.= cache', '.PrebuildEnv') if os.path.isfile(PrebuildEnvFile): os.remove(PrebuildEnvFile) if os.path.isfile(self.PlatformBuildPath): @@ -1171,7 +1171,7 @@ class Build(): f =3D open(PrebuildEnvFile) envs =3D f.readlines() f.close() - envs =3D itertools.imap(lambda l: l.split('=3D',1), envs) + envs =3D itertools.imap(lambda l: l.split('=3D', 1), envs) envs =3D itertools.ifilter(lambda l: len(l) =3D=3D 2, envs) envs =3D itertools.imap(lambda l: [i.strip() for i in l], = envs) os.environ.update(dict(envs)) @@ -2352,7 +2352,7 @@ def MyOptionParser(): Parser.add_option("-D", "--define", action=3D"append", type=3D"string"= , dest=3D"Macros", help=3D"Macro: \"Name [=3D Value]\".") =20 Parser.add_option("-y", "--report-file", action=3D"store", dest=3D"Rep= ortFile", help=3D"Create/overwrite the report to the specified filename.") - Parser.add_option("-Y", "--report-type", action=3D"append", type=3D"ch= oice", choices=3D['PCD','LIBRARY','FLASH','DEPEX','BUILD_FLAGS','FIXED_ADDR= ESS','HASH','EXECUTION_ORDER'], dest=3D"ReportType", default=3D[], + Parser.add_option("-Y", "--report-type", action=3D"append", type=3D"ch= oice", choices=3D['PCD', 'LIBRARY', 'FLASH', 'DEPEX', 'BUILD_FLAGS', 'FIXED= _ADDRESS', 'HASH', 'EXECUTION_ORDER'], dest=3D"ReportType", default=3D[], help=3D"Flags that control the type of build report to generate. = Must be one of: [PCD, LIBRARY, FLASH, DEPEX, BUILD_FLAGS, FIXED_ADDRESS, HA= SH, EXECUTION_ORDER]. "\ "To specify more than one flag, repeat this option on the com= mand line and the default flag set is [PCD, LIBRARY, FLASH, DEPEX, HASH, BU= ILD_FLAGS, FIXED_ADDRESS]") Parser.add_option("-F", "--flag", action=3D"store", type=3D"string", d= est=3D"Flag", diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py index 1cf2ce13be2b..1eafecefbacd 100644 --- a/BaseTools/Tests/TestTools.py +++ b/BaseTools/Tests/TestTools.py @@ -161,7 +161,7 @@ class BaseToolsTest(unittest.TestCase): if minlen is None: minlen =3D 1024 if maxlen is None: maxlen =3D minlen return ''.join( - [chr(random.randint(0,255)) + [chr(random.randint(0, 255)) for x in range(random.randint(minlen, maxlen)) ]) =20 diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-bui= ld.py index 49ff656c066f..3bf524123d0f 100755 --- a/BaseTools/gcc/mingw-gcc-build.py +++ b/BaseTools/gcc/mingw-gcc-build.py @@ -187,7 +187,7 @@ class Config: return path =20 def MakeDirs(self): - for path in (self.src_dir, self.build_dir,self.prefix, self.symlin= ks): + for path in (self.src_dir, self.build_dir, self.prefix, self.symli= nks): if not os.path.exists(path): os.makedirs(path) =20 --=20 2.16.1 _______________________________________________ edk2-devel mailing list edk2-devel@lists.01.org https://lists.01.org/mailman/listinfo/edk2-devel
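All of the hunks above make the same mechanical change: a single space is added after commas and after the colon in dict literals, with nothing else touched. As a quick illustration (a sketch only, not part of the patch; the sample literals are copied from the hunks above), Python's ast module shows that the two spellings parse to identical syntax trees, i.e. the rewrite is purely cosmetic:

    import ast

    # Representative pair: the spacing style the patch removes vs. the one it adds.
    before = "CheckDict = {'FileName':None}; Scope = ('COMMON','COMMON','COMMON')"
    after  = "CheckDict = {'FileName': None}; Scope = ('COMMON', 'COMMON', 'COMMON')"

    # ast.dump() does not record whitespace, so equal dumps confirm the
    # before/after code is semantically identical.
    assert ast.dump(ast.parse(before)) == ast.dump(ast.parse(after))
    print("whitespace-only change: ASTs match")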