Cooled turbine blade
This notebook demonstrates how to use the Workbench client to upload project files, run scripts, start services, and retrieve output files, and how to launch PyMechanical to solve the model and visualize the results.
First, import the necessary modules: pathlib for handling filesystem paths and os for interacting with the operating system. The launch_workbench function from ansys.workbench.core is imported to start a Workbench session, and connect_to_mechanical from ansys.mechanical.core to connect to a Mechanical session.
[1]:
import os
import pathlib
[2]:
from ansys.workbench.core import launch_workbench
from ansys.mechanical.core import connect_to_mechanical
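Optionally, confirm which client packages are installed before launching anything. This is a minimal sanity check in plain Python; the distribution names ansys-workbench-core and ansys-mechanical-core are assumptions inferred from the import paths above.
from importlib.metadata import version

# Distribution names are assumed from the import paths; adjust if they differ.
print("ansys-workbench-core:", version("ansys-workbench-core"))
print("ansys-mechanical-core:", version("ansys-mechanical-core"))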
Launch the Workbench service on the local machine, using a few options, and define the directories used during the session: workdir is set to the directory containing this notebook, and assets, scripts, and wbpz are subdirectories within the working directory. The launch_workbench function is called to start a Workbench session in the specified directory.
[3]:
# Note: "__file__" is a string literal here, so this resolves to the
# current working directory of the notebook.
workdir = pathlib.Path("__file__").parent
[4]:
assets = workdir / "assets"
scripts = workdir / "scripts"
[5]:
wb = launch_workbench(show_gui=True, client_workdir=str(workdir.absolute()))
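If no interactive session is needed, the same launch can run headless. This sketch only reuses the two parameters shown above and is left commented out to avoid starting a second session.
# Hypothetical headless variant of the launch above (GUI disabled).
# wb = launch_workbench(show_gui=False, client_workdir=str(workdir.absolute()))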
Upload the project file to the server using the upload_file_from_example_repo method. The file to upload is cooled_turbine_blade.wbpz.
[6]:
wb.upload_file_from_example_repo("cooled-turbine-blade/wbpz/cooled_turbine_blade.wbpz")
Uploading cooled_turbine_blade.wbpz: 100%|██████████| 1.18M/1.18M [00:00<00:00, 26.4MB/s]
Execute a Workbench script (project.wbjn) to define the project and load the geometry using the run_script_file method. The set_log_file method directs the logs to wb_log_file.log. The name of the system created is stored in sys_name and printed.
[7]:
export_path = 'wb_log_file.log'
wb.set_log_file(export_path)
sys_name = wb.run_script_file(str((assets / "project.wbjn").absolute()), log_level='info')
print(sys_name)
SYS
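The contents of project.wbjn are not reproduced in this notebook. A minimal journal that unarchives the uploaded project and reports the name of its first system might look like the sketch below. Unarchive and GetAllSystems are standard Workbench journal commands, but returning a value through wb_script_result is an assumption about how PyWorkbench passes results back to the client.
# Hypothetical sketch of project.wbjn (executed by Workbench on the server,
# not in this notebook).
import json

# Unarchive the uploaded project archive into the server working directory.
Unarchive(ArchivePath="cooled_turbine_blade.wbpz", Overwrite=True)

# Assumed PyWorkbench convention: the value of wb_script_result is returned
# to the client; here it would be the system name, such as "SYS".
wb_script_result = json.dumps(GetAllSystems()[0].Name)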
Start a PyMechanical server for the system using the start_mechanical_server method, then create a PyMechanical client session connected to this server using connect_to_mechanical. The project directory is printed to verify the connection.
[8]:
server_port = wb.start_mechanical_server(system_name=sys_name)
[9]:
mechanical = connect_to_mechanical(ip='localhost', port=server_port)
[10]:
print(mechanical.project_directory)
C:\Users\ansys\AppData\Local\Tempwbpj\example_02_Cooled_Turbine_Blade_files\
Read the script cooled_turbine_blade.py and execute it via the PyMechanical client using run_python_script. This script contains the commands that mesh and solve the turbine blade model. The output of the script is printed.
[11]:
with open(scripts / "cooled_turbine_blade.py") as sf:
    mech_script = sf.read()
mech_output = mechanical.run_python_script(mech_script)
print(mech_output)
{"Stress": "2802182020.5917487 [Pa]"}
Run a script through the Mechanical client to fetch the working directory of the analysis, and print the server-side path where all solver files are stored. Then download the solver output file (solve.out) from the server to the client's current working directory and print its contents.
[12]:
mechanical.run_python_script(f"solve_dir=ExtAPI.DataModel.AnalysisList[1].WorkingDir")
[12]:
''
[13]:
result_solve_dir_server = mechanical.run_python_script("solve_dir")
print(f"All solver files are stored on the server at: {result_solve_dir_server}")
All solver files are stored on the server at: C:\Users\ansys\AppData\Local\Tempwbpj\example_02_Cooled_Turbine_Blade_files\dp0\SYS-8\MECH\
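The assignment in cell [12] returns an empty string because an assignment has no value; the path only comes back when the variable is evaluated in the next call. Assuming run_python_script returns the value of the script's last expression, the two calls could be collapsed into one:
# Hypothetical one-call variant of cells [12] and [13].
result_solve_dir_server = mechanical.run_python_script(
    "ExtAPI.DataModel.AnalysisList[1].WorkingDir"
)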
[14]:
solve_out_path = os.path.join(result_solve_dir_server, "solve.out")
[15]:
def write_file_contents_to_console(path):
    """Write file contents to console."""
    with open(path, "rt") as file:
        for line in file:
            print(line, end="")
[16]:
current_working_directory = os.getcwd()
mechanical.download(solve_out_path, target_dir=current_working_directory)
solve_out_local_path = os.path.join(current_working_directory, "solve.out")
write_file_contents_to_console(solve_out_local_path)
os.remove(solve_out_local_path)
Downloading dns:///127.0.0.1:56164:C:\Users\ansys\AppData\Local\Tempwbpj\example_02_Cooled_Turbine_Blade_files\dp0\SYS-8\MECH\solve.out to C:\Users\ansys\actions-runner\_work\pyworkbench-examples\pyworkbench-examples\pyworkbench-examples\doc\source\examples\cooled-turbine-blade\solve.out: 100%|██████████| 32.6k/32.6k [00:00<?, ?B/s]
Ansys Mechanical Enterprise
*------------------------------------------------------------------*
| |
| W E L C O M E T O T H E A N S Y S (R) P R O G R A M |
| |
*------------------------------------------------------------------*
***************************************************************
* ANSYS MAPDL 2024 R2 LEGAL NOTICES *
***************************************************************
* *
* Copyright 1971-2024 Ansys, Inc. All rights reserved. *
* Unauthorized use, distribution or duplication is *
* prohibited. *
* *
* Ansys is a registered trademark of Ansys, Inc. or its *
* subsidiaries in the United States or other countries. *
* See the Ansys, Inc. online documentation or the Ansys, Inc. *
* documentation CD or online help for the complete Legal *
* Notice. *
* *
***************************************************************
* *
* THIS ANSYS SOFTWARE PRODUCT AND PROGRAM DOCUMENTATION *
* INCLUDE TRADE SECRETS AND CONFIDENTIAL AND PROPRIETARY *
* PRODUCTS OF ANSYS, INC., ITS SUBSIDIARIES, OR LICENSORS. *
* The software products and documentation are furnished by *
* Ansys, Inc. or its subsidiaries under a software license *
* agreement that contains provisions concerning *
* non-disclosure, copying, length and nature of use, *
* compliance with exporting laws, warranties, disclaimers, *
* limitations of liability, and remedies, and other *
* provisions. The software products and documentation may be *
* used, disclosed, transferred, or copied only in accordance *
* with the terms and conditions of that software license *
* agreement. *
* *
* Ansys, Inc. is a UL registered *
* ISO 9001:2015 company. *
* *
***************************************************************
* *
* This product is subject to U.S. laws governing export and *
* re-export. *
* *
* For U.S. Government users, except as specifically granted *
* by the Ansys, Inc. software license agreement, the use, *
* duplication, or disclosure by the United States Government *
* is subject to restrictions stated in the Ansys, Inc. *
* software license agreement and FAR 12.212 (for non-DOD *
* licenses). *
* *
***************************************************************
2024 R2
Point Releases and Patches installed:
Ansys Service Pack 2024 R2.01
Ansys Service Pack 2024 R2.02
Ansys Service Pack 2024 R2.03
Ansys, Inc. License Manager 2024 R2
Ansys, Inc. License Manager 2024 R2.01
Ansys, Inc. License Manager 2024 R2.02
Ansys, Inc. License Manager 2024 R2.03
Discovery 2024 R2
Discovery 2024 R2.01
Discovery 2024 R2.02
Discovery 2024 R2.03
Core WB Files 2024 R2
Core WB Files 2024 R2.01
Core WB Files 2024 R2.02
Core WB Files 2024 R2.03
SpaceClaim 2024 R2
SpaceClaim 2024 R2.01
SpaceClaim 2024 R2.02
SpaceClaim 2024 R2.03
Icepak (includes CFD-Post) 2024 R2
Icepak (includes CFD-Post) 2024 R2.01
Icepak (includes CFD-Post) 2024 R2.02
Icepak (includes CFD-Post) 2024 R2.03
CFD-Post only 2024 R2
CFD-Post only 2024 R2.01
CFD-Post only 2024 R2.02
CFD-Post only 2024 R2.03
CFX (includes CFD-Post) 2024 R2
CFX (includes CFD-Post) 2024 R2.01
CFX (includes CFD-Post) 2024 R2.02
CFX (includes CFD-Post) 2024 R2.03
Chemkin 2024 R2
Chemkin 2024 R2.01
Chemkin 2024 R2.02
Chemkin 2024 R2.03
EnSight 2024 R2
EnSight 2024 R2.01
EnSight 2024 R2.02
EnSight 2024 R2.03
FENSAP-ICE 2024 R2
FENSAP-ICE 2024 R2.01
FENSAP-ICE 2024 R2.02
FENSAP-ICE 2024 R2.03
Fluent (includes CFD-Post) 2024 R2
Fluent (includes CFD-Post) 2024 R2.01
Fluent (includes CFD-Post) 2024 R2.02
Fluent (includes CFD-Post) 2024 R2.03
Polyflow (includes CFD-Post) 2024 R2
Polyflow (includes CFD-Post) 2024 R2.01
Polyflow (includes CFD-Post) 2024 R2.02
Polyflow (includes CFD-Post) 2024 R2.03
Forte (includes EnSight) 2024 R2
Forte (includes EnSight) 2024 R2.01
Forte (includes EnSight) 2024 R2.02
Forte (includes EnSight) 2024 R2.03
ICEM CFD 2024 R2
ICEM CFD 2024 R2.01
ICEM CFD 2024 R2.02
ICEM CFD 2024 R2.03
TurboGrid 2024 R2
TurboGrid 2024 R2.01
TurboGrid 2024 R2.02
TurboGrid 2024 R2.03
Speos 2024 R2
Speos 2024 R2.01
Speos 2024 R2.02
Speos 2024 R2.03
Speos HPC 2024 R2
Speos HPC 2024 R2.01
Speos HPC 2024 R2.02
Speos HPC 2024 R2.03
optiSLang 2024 R2
optiSLang 2024 R2.01
optiSLang 2024 R2.02
optiSLang 2024 R2.03
Remote Solve Manager Standalone Services 2024 R2
Remote Solve Manager Standalone Services 2024 R2.01
Remote Solve Manager Standalone Services 2024 R2.02
Remote Solve Manager Standalone Services 2024 R2.03
Additive 2024 R2
Additive 2024 R2.01
Additive 2024 R2.02
Additive 2024 R2.03
Aqwa 2024 R2
Aqwa 2024 R2.01
Aqwa 2024 R2.02
Aqwa 2024 R2.03
Autodyn 2024 R2
Autodyn 2024 R2.01
Autodyn 2024 R2.02
Autodyn 2024 R2.03
Customization Files for User Programmable Features 2024 R2
Customization Files for User Programmable Features 2024 R2.01
Customization Files for User Programmable Features 2024 R2.02
Customization Files for User Programmable Features 2024 R2.03
LS-DYNA 2024 R2
LS-DYNA 2024 R2.01
LS-DYNA 2024 R2.02
LS-DYNA 2024 R2.03
Mechanical Products 2024 R2
Mechanical Products 2024 R2.01
Mechanical Products 2024 R2.02
Mechanical Products 2024 R2.03
Motion 2024 R2
Motion 2024 R2.01
Motion 2024 R2.02
Motion 2024 R2.03
Sherlock 2024 R2
Sherlock 2024 R2.01
Sherlock 2024 R2.02
Sherlock 2024 R2.03
Sound - SAS 2024 R2
Sound - SAS 2024 R2.01
Sound - SAS 2024 R2.02
Sound - SAS 2024 R2.03
ACIS Geometry Interface 2024 R2
ACIS Geometry Interface 2024 R2.01
ACIS Geometry Interface 2024 R2.02
ACIS Geometry Interface 2024 R2.03
AutoCAD Geometry Interface 2024 R2
AutoCAD Geometry Interface 2024 R2.01
AutoCAD Geometry Interface 2024 R2.02
AutoCAD Geometry Interface 2024 R2.03
Catia, Version 4 Geometry Interface 2024 R2
Catia, Version 4 Geometry Interface 2024 R2.01
Catia, Version 4 Geometry Interface 2024 R2.02
Catia, Version 4 Geometry Interface 2024 R2.03
Catia, Version 5 Geometry Interface 2024 R2
Catia, Version 5 Geometry Interface 2024 R2.01
Catia, Version 5 Geometry Interface 2024 R2.02
Catia, Version 5 Geometry Interface 2024 R2.03
Catia, Version 6 Geometry Interface 2024 R2
Catia, Version 6 Geometry Interface 2024 R2.01
Catia, Version 6 Geometry Interface 2024 R2.02
Catia, Version 6 Geometry Interface 2024 R2.03
Creo Elements/Direct Modeling Geometry Interface 2024 R2
Creo Elements/Direct Modeling Geometry Interface 2024 R2.01
Creo Elements/Direct Modeling Geometry Interface 2024 R2.02
Creo Elements/Direct Modeling Geometry Interface 2024 R2.03
Creo Parametric Geometry Interface 2024 R2
Creo Parametric Geometry Interface 2024 R2.01
Creo Parametric Geometry Interface 2024 R2.02
Creo Parametric Geometry Interface 2024 R2.03
Inventor Geometry Interface 2024 R2
Inventor Geometry Interface 2024 R2.01
Inventor Geometry Interface 2024 R2.02
Inventor Geometry Interface 2024 R2.03
JTOpen Geometry Interface 2024 R2
JTOpen Geometry Interface 2024 R2.01
JTOpen Geometry Interface 2024 R2.02
JTOpen Geometry Interface 2024 R2.03
NX Geometry Interface 2024 R2
NX Geometry Interface 2024 R2.01
NX Geometry Interface 2024 R2.02
NX Geometry Interface 2024 R2.03
Parasolid Geometry Interface 2024 R2
Parasolid Geometry Interface 2024 R2.01
Parasolid Geometry Interface 2024 R2.02
Parasolid Geometry Interface 2024 R2.03
Solid Edge Geometry Interface 2024 R2
Solid Edge Geometry Interface 2024 R2.01
Solid Edge Geometry Interface 2024 R2.02
Solid Edge Geometry Interface 2024 R2.03
SOLIDWORKS Geometry Interface 2024 R2
SOLIDWORKS Geometry Interface 2024 R2.01
SOLIDWORKS Geometry Interface 2024 R2.02
SOLIDWORKS Geometry Interface 2024 R2.03
***** MAPDL COMMAND LINE ARGUMENTS *****
BATCH MODE REQUESTED (-b) = NOLIST
INPUT FILE COPY MODE (-c) = COPY
DISTRIBUTED MEMORY PARALLEL REQUESTED
4 PARALLEL PROCESSES REQUESTED WITH SINGLE THREAD PER PROCESS
TOTAL OF 4 CORES REQUESTED
INPUT FILE NAME = C:\Users\ansys\AppData\Local\Tempwbpj\_ProjectScratch\Scr5712\dummy.dat
OUTPUT FILE NAME = C:\Users\ansys\AppData\Local\Tempwbpj\_ProjectScratch\Scr5712\solve.out
START-UP FILE MODE = NOREAD
STOP FILE MODE = NOREAD
RELEASE= 2024 R2 BUILD= 24.2 UP20240603 VERSION=WINDOWS x64
CURRENT JOBNAME=file0 18:02:08 JAN 08, 2025 CP= 0.094
PARAMETER _DS_PROGRESS = 999.0000000
/INPUT FILE= ds.dat LINE= 0
*** NOTE *** CP = 0.422 TIME= 18:02:09
The /CONFIG,NOELDB command is not valid in a distributed memory
parallel solution. Command is ignored.
*GET _WALLSTRT FROM ACTI ITEM=TIME WALL VALUE= 18.0358333
TITLE=
example_02_Cooled_Turbine_Blade--Static Structural (C5)
ACT Extensions:
LSDYNA, 2024.2
5f463412-bd3e-484b-87e7-cbc0a665e474, wbex
/COM, ANSYSMotion, 2024.2
20180725-3f81-49eb-9f31-41364844c769, wbex
SET PARAMETER DIMENSIONS ON _WB_PROJECTSCRATCH_DIR
TYPE=STRI DIMENSIONS= 248 1 1
PARAMETER _WB_PROJECTSCRATCH_DIR(1) = C:\Users\ansys\AppData\Local\Tempwbpj\_ProjectScratch\Scr5712\
SET PARAMETER DIMENSIONS ON _WB_SOLVERFILES_DIR
TYPE=STRI DIMENSIONS= 248 1 1
PARAMETER _WB_SOLVERFILES_DIR(1) = C:\Users\ansys\AppData\Local\Tempwbpj\example_02_Cooled_Turbine_Blade_files\dp0\SYS-8\MECH\
SET PARAMETER DIMENSIONS ON _WB_USERFILES_DIR
TYPE=STRI DIMENSIONS= 248 1 1
PARAMETER _WB_USERFILES_DIR(1) = C:\Users\ansys\AppData\Local\Tempwbpj\example_02_Cooled_Turbine_Blade_files\user_files\
--- Data in consistent MKS units. See Solving Units in the help system for more
MKS UNITS SPECIFIED FOR INTERNAL
LENGTH (l) = METER (M)
MASS (M) = KILOGRAM (KG)
TIME (t) = SECOND (SEC)
TEMPERATURE (T) = CELSIUS (C)
TOFFSET = 273.0
CHARGE (Q) = COULOMB
FORCE (f) = NEWTON (N) (KG-M/SEC2)
HEAT = JOULE (N-M)
PRESSURE = PASCAL (NEWTON/M**2)
ENERGY (W) = JOULE (N-M)
POWER (P) = WATT (N-M/SEC)
CURRENT (i) = AMPERE (COULOMBS/SEC)
CAPACITANCE (C) = FARAD
INDUCTANCE (L) = HENRY
MAGNETIC FLUX = WEBER
RESISTANCE (R) = OHM
ELECTRIC POTENTIAL = VOLT
INPUT UNITS ARE ALSO SET TO MKS
*** MAPDL - ENGINEERING ANALYSIS SYSTEM RELEASE 2024 R2 24.2 ***
Ansys Mechanical Enterprise
00000000 VERSION=WINDOWS x64 18:02:09 JAN 08, 2025 CP= 0.438
example_02_Cooled_Turbine_Blade--Static Structural (C5)
***** MAPDL ANALYSIS DEFINITION (PREP7) *****
*********** Nodes for the whole assembly ***********
*********** Elements for Body 1 'SYS-3\Solid' ***********
*********** Send User Defined Coordinate System(s) ***********
*********** Set Reference Temperature ***********
*********** Send Materials ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Node Component ***********
*********** Send Named Selection as Element Component ***********
*********** Send Named Selection as Element Component ***********
*********** Fixed Supports ***********
***** ROUTINE COMPLETED ***** CP = 1.156
--- Number of total nodes = 37315
--- Number of contact elements = 0
--- Number of spring elements = 0
--- Number of bearing elements = 0
--- Number of solid elements = 29976
--- Number of condensed parts = 0
--- Number of total elements = 29736
*GET _WALLBSOL FROM ACTI ITEM=TIME WALL VALUE= 18.0358333
****************************************************************************
************************* SOLUTION ********************************
****************************************************************************
***** MAPDL SOLUTION ROUTINE *****
PERFORM A STATIC ANALYSIS
THIS WILL BE A NEW ANALYSIS
PARAMETER _THICKRATIO = 0.9090000000E-01
USE SPARSE MATRIX DIRECT SOLVER
CONTACT INFORMATION PRINTOUT LEVEL 1
CHECK INITIAL OPEN/CLOSED STATUS OF SELECTED CONTACT ELEMENTS
AND LIST DETAILED CONTACT PAIR INFORMATION
SPLIT CONTACT SURFACES AT SOLVE PHASE
NUMBER OF SPLITTING TBD BY PROGRAM
DO NOT COMBINE ELEMENT MATRIX FILES (.emat) AFTER DISTRIBUTED PARALLEL SOLUTION
DO NOT COMBINE ELEMENT SAVE DATA FILES (.esav) AFTER DISTRIBUTED PARALLEL SOLUTION
NLDIAG: Nonlinear diagnostics CONT option is set to ON.
Writing frequency : each ITERATION.
DO NOT SAVE ANY RESTART FILES AT ALL
****************************************************
******************* SOLVE FOR LS 1 OF 1 ****************
*********** Create Imported Load "Imported Body Temperature" ***********
PRINTOUT RESUMED BY /GOP
PRINTOUT RESUMED BY /GOP
USE 1 SUBSTEPS INITIALLY THIS LOAD STEP FOR ALL DEGREES OF FREEDOM
FOR AUTOMATIC TIME STEPPING:
USE 1 SUBSTEPS AS A MAXIMUM
USE 1 SUBSTEPS AS A MINIMUM
TIME= 1.0000
ERASE THE CURRENT DATABASE OUTPUT CONTROL TABLE.
WRITE ALL ITEMS TO THE DATABASE WITH A FREQUENCY OF NONE
FOR ALL APPLICABLE ENTITIES
WRITE NSOL ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE RSOL ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE EANG ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE ETMP ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE VENG ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE STRS ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE EPEL ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE EPPL ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE EPTH ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
WRITE CONT ITEMS TO THE DATABASE WITH A FREQUENCY OF ALL
FOR ALL APPLICABLE ENTITIES
*GET ANSINTER_ FROM ACTI ITEM=INT VALUE= 0.00000000
*IF ANSINTER_ ( = 0.00000 ) NE
0 ( = 0.00000 ) THEN
*ENDIF
*** NOTE *** CP = 1.453 TIME= 18:02:09
The automatic domain decomposition logic has selected the MESH domain
decomposition method with 4 processes per solution.
***** MAPDL SOLVE COMMAND *****
*** WARNING *** CP = 1.766 TIME= 18:02:10
Element shape checking is currently inactive. Issue SHPP,ON or
SHPP,WARN to reactivate, if desired.
*** WARNING *** CP = 1.891 TIME= 18:02:10
SOLID185 wedges are recommended only in regions of relatively low
stress gradients.
*** NOTE *** CP = 1.953 TIME= 18:02:10
The model data was checked and warning messages were found.
Please review output or errors file (
C:\Users\ansys\AppData\Local\Tempwbpj\_ProjectScratch\Scr5712\file0.err
r ) for these warning messages.
*** SELECTION OF ELEMENT TECHNOLOGIES FOR APPLICABLE ELEMENTS ***
--- GIVE SUGGESTIONS AND RESET THE KEY OPTIONS ---
ELEMENT TYPE 1 IS SOLID185. IT IS ASSOCIATED WITH LINEAR MATERIALS ONLY
AND POISSON'S RATIO IS NOT GREATER THAN 0.49. KEYOPT(2)=3 IS SUGGESTED AND
HAS BEEN RESET.
KEYOPT(1-12)= 0 3 0 0 0 0 0 0 0 0 0 0
*** MAPDL - ENGINEERING ANALYSIS SYSTEM RELEASE 2024 R2 24.2 ***
Ansys Mechanical Enterprise
00000000 VERSION=WINDOWS x64 18:02:10 JAN 08, 2025 CP= 2.000
example_02_Cooled_Turbine_Blade--Static Structural (C5)
S O L U T I O N O P T I O N S
PROBLEM DIMENSIONALITY. . . . . . . . . . . . .3-D
DEGREES OF FREEDOM. . . . . . UX UY UZ
ANALYSIS TYPE . . . . . . . . . . . . . . . . .STATIC (STEADY-STATE)
OFFSET TEMPERATURE FROM ABSOLUTE ZERO . . . . . 273.15
EQUATION SOLVER OPTION. . . . . . . . . . . . .SPARSE
GLOBALLY ASSEMBLED MATRIX . . . . . . . . . . .SYMMETRIC
*** NOTE *** CP = 2.062 TIME= 18:02:10
The conditions for direct assembly have been met. No .emat or .erot
files will be produced.
D I S T R I B U T E D D O M A I N D E C O M P O S E R
...Number of elements: 29736
...Number of nodes: 37315
...Decompose to 4 CPU domains
...Element load balance ratio = 1.000
L O A D S T E P O P T I O N S
LOAD STEP NUMBER. . . . . . . . . . . . . . . . 1
TIME AT END OF THE LOAD STEP. . . . . . . . . . 1.0000
NUMBER OF SUBSTEPS. . . . . . . . . . . . . . . 1
STEP CHANGE BOUNDARY CONDITIONS . . . . . . . . NO
PRINT OUTPUT CONTROLS . . . . . . . . . . . . .NO PRINTOUT
DATABASE OUTPUT CONTROLS
ITEM FREQUENCY COMPONENT
ALL NONE
NSOL ALL
RSOL ALL
EANG ALL
ETMP ALL
VENG ALL
STRS ALL
EPEL ALL
EPPL ALL
EPTH ALL
CONT ALL
SOLUTION MONITORING INFO IS WRITTEN TO FILE= file.mntr
Range of element maximum matrix coefficients in global coordinates
Maximum = 3.950515708E+09 at element 28705.
Minimum = 125678644 at element 5064.
*** ELEMENT MATRIX FORMULATION TIMES
TYPE NUMBER ENAME TOTAL CP AVE CP
1 29736 SOLID185 3.562 0.000120
Time at end of element matrix formulation CP = 4.015625.
DISTRIBUTED SPARSE MATRIX DIRECT SOLVER.
Number of equations = 101637, Maximum wavefront = 132
Memory allocated on only this MPI rank (rank 0)
-------------------------------------------------------------------
Equation solver memory allocated = 122.361 MB
Equation solver memory required for in-core mode = 117.315 MB
Equation solver memory required for out-of-core mode = 47.728 MB
Total (solver and non-solver) memory allocated = 670.936 MB
Total memory summed across all MPI ranks on this machines
-------------------------------------------------------------------
Equation solver memory allocated = 511.345 MB
Equation solver memory required for in-core mode = 490.030 MB
Equation solver memory required for out-of-core mode = 194.206 MB
Total (solver and non-solver) memory allocated = 1669.290 MB
*** NOTE *** CP = 4.406 TIME= 18:02:12
The Distributed Sparse Matrix Solver is currently running in the
in-core memory mode. This memory mode uses the most amount of memory
in order to avoid using the hard drive as much as possible, which most
often results in the fastest solution time. This mode is recommended
if enough physical memory is present to accommodate all of the solver
data.
curEqn= 27348 totEqn= 27348 Job CP sec= 4.953
Factor Done= 100% Factor Wall sec= 0.526 rate= 9.5 GFlops
Distributed sparse solver maximum pivot= 7.299354972E+09 at node 30840
UY.
Distributed sparse solver minimum pivot= 54855431.6 at node 36377 UZ.
Distributed sparse solver minimum pivot in absolute value= 54855431.6
at node 36377 UZ.
*** ELEMENT RESULT CALCULATION TIMES
TYPE NUMBER ENAME TOTAL CP AVE CP
1 29736 SOLID185 5.172 0.000174
*** NODAL LOAD CALCULATION TIMES
TYPE NUMBER ENAME TOTAL CP AVE CP
1 29736 SOLID185 0.562 0.000019
*** LOAD STEP 1 SUBSTEP 1 COMPLETED. CUM ITER = 1
*** TIME = 1.00000 TIME INC = 1.00000 NEW TRIANG MATRIX
*** MAPDL BINARY FILE STATISTICS
BUFFER SIZE USED= 16384
12.188 MB WRITTEN ON ASSEMBLED MATRIX FILE: file0.full
7.312 MB WRITTEN ON RESULTS FILE: file0.rst
*************** Write FE CONNECTORS *********
WRITE OUT CONSTRAINT EQUATIONS TO FILE= file.ce
****************************************************
*************** FINISHED SOLVE FOR LS 1 *************
*GET _WALLASOL FROM ACTI ITEM=TIME WALL VALUE= 18.0375000
PRINTOUT RESUMED BY /GOP
FINISH SOLUTION PROCESSING
***** ROUTINE COMPLETED ***** CP = 7.250
*** MAPDL - ENGINEERING ANALYSIS SYSTEM RELEASE 2024 R2 24.2 ***
Ansys Mechanical Enterprise
00000000 VERSION=WINDOWS x64 18:02:15 JAN 08, 2025 CP= 7.250
example_02_Cooled_Turbine_Blade--Static Structural (C5)
***** MAPDL RESULTS INTERPRETATION (POST1) *****
*** NOTE *** CP = 7.250 TIME= 18:02:15
Reading results into the database (SET command) will update the current
displacement and force boundary conditions in the database with the
values from the results file for that load set. Note that any
subsequent solutions will use these values unless action is taken to
either SAVE the current values or not overwrite them (/EXIT,NOSAVE).
Set Encoding of XML File to:ISO-8859-1
Set Output of XML File to:
PARM, , , , , , , , , , , ,
, , , , , , ,
DATABASE WRITTEN ON FILE parm.xml
EXIT THE MAPDL POST1 DATABASE PROCESSOR
***** ROUTINE COMPLETED ***** CP = 7.250
PRINTOUT RESUMED BY /GOP
*GET _WALLDONE FROM ACTI ITEM=TIME WALL VALUE= 18.0375000
PARAMETER _PREPTIME = 0.000000000
PARAMETER _SOLVTIME = 6.000000000
PARAMETER _POSTTIME = 0.000000000
PARAMETER _TOTALTIM = 6.000000000
*GET _DLBRATIO FROM ACTI ITEM=SOLU DLBR VALUE= 1.00000000
*GET _COMBTIME FROM ACTI ITEM=SOLU COMB VALUE= 0.198189700
*GET _SSMODE FROM ACTI ITEM=SOLU SSMM VALUE= 2.00000000
*GET _NDOFS FROM ACTI ITEM=SOLU NDOF VALUE= 101637.000
*GET _SOL_END_TIME FROM ACTI ITEM=SET TIME VALUE= 1.00000000
*IF _sol_end_time ( = 1.00000 ) EQ
1.000000 ( = 1.00000 ) THEN
/FCLEAN COMMAND REMOVING ALL LOCAL FILES
*ENDIF
--- Total number of nodes = 37315
--- Total number of elements = 29736
--- Element load balance ratio = 1
--- Time to combine distributed files = 0.1981897
--- Sparse memory mode = 2
--- Number of DOF = 101637
EXIT MAPDL WITHOUT SAVING DATABASE
NUMBER OF WARNING MESSAGES ENCOUNTERED= 2
NUMBER OF ERROR MESSAGES ENCOUNTERED= 0
+--------------------- M A P D L S T A T I S T I C S ------------------------+
Release: 2024 R2 Build: 24.2 Update: UP20240603 Platform: WINDOWS x64
Date Run: 01/08/2025 Time: 18:02 Process ID: 15684
Operating System: Windows 11 (Build: 22631)
Processor Model: Intel(R) Xeon(R) Platinum 8171M CPU @ 2.60GHz
Compiler: Intel(R) Fortran Compiler Classic Version 2021.9 (Build: 20230302)
Intel(R) C/C++ Compiler Classic Version 2021.9 (Build: 20230302)
Intel(R) oneAPI Math Kernel Library Version 2023.1-Product Build 20230303
Number of machines requested : 1
Total number of cores available : 8
Number of physical cores available : 4
Number of processes requested : 4
Number of threads per process requested : 1
Total number of cores requested : 4 (Distributed Memory Parallel)
MPI Type: INTELMPI
MPI Version: Intel(R) MPI Library 2021.11 for Windows* OS
GPU Acceleration: Not Requested
Job Name: file0
Input File: dummy.dat
Core Machine Name Working Directory
-----------------------------------------------------
0 pyworkbench C:\Users\ansys\AppData\Local\Tempwbpj\_ProjectScratch\Scr5712
1 pyworkbench C:\Users\ansys\AppData\Local\Tempwbpj\_ProjectScratch\Scr5712
2 pyworkbench C:\Users\ansys\AppData\Local\Tempwbpj\_ProjectScratch\Scr5712
3 pyworkbench C:\Users\ansys\AppData\Local\Tempwbpj\_ProjectScratch\Scr5712
Latency time from master to core 1 = 3.195 microseconds
Latency time from master to core 2 = 3.233 microseconds
Latency time from master to core 3 = 3.498 microseconds
Communication speed from master to core 1 = 5228.25 MB/sec
Communication speed from master to core 2 = 4826.31 MB/sec
Communication speed from master to core 3 = 4939.34 MB/sec
Total CPU time for main thread : 5.2 seconds
Total CPU time summed for all threads : 7.7 seconds
Elapsed time spent obtaining a license : 0.5 seconds
Elapsed time spent pre-processing model (/PREP7) : 0.3 seconds
Elapsed time spent solution - preprocessing : 0.4 seconds
Elapsed time spent computing solution : 4.4 seconds
Elapsed time spent solution - postprocessing : 0.2 seconds
Elapsed time spent post-processing model (/POST1) : 0.0 seconds
Equation solver used : Sparse (symmetric)
Equation solver computational rate : 44.4 Gflops
Equation solver effective I/O rate : 13.9 GB/sec
Sum of disk space used on all processes : 107.3 MB
Sum of memory used on all processes : 729.0 MB
Sum of memory allocated on all processes : 3246.0 MB
Physical memory available : 32 GB
Total amount of I/O written to disk : 0.1 GB
Total amount of I/O read from disk : 0.0 GB
+------------------ E N D M A P D L S T A T I S T I C S -------------------+
*-----------------------------------------------------------------------------*
| |
| RUN COMPLETED |
| |
|-----------------------------------------------------------------------------|
| |
| Ansys MAPDL 2024 R2 Build 24.2 UP20240603 WINDOWS x64 |
| |
|-----------------------------------------------------------------------------|
| |
| Database Requested(-db) 1024 MB Scratch Memory Requested 1024 MB |
| Max Database Used(Master) 39 MB Max Scratch Used(Master) 179 MB |
| Max Database Used(Workers) 1 MB Max Scratch Used(Workers) 172 MB |
| Sum Database Used(All) 42 MB Sum Scratch Used(All) 687 MB |
| |
|-----------------------------------------------------------------------------|
| |
| CP Time (sec) = 7.734 Time = 18:02:15 |
| Elapsed Time (sec) = 8.000 Date = 01/08/2025 |
| |
*-----------------------------------------------------------------------------*
Similarly, run a script through the Mechanical client to fetch the directory where images are stored on the server, and print that path. Then download an image file (stress.png) from the server to the client's current working directory and display it using matplotlib.
[17]:
from matplotlib import image as mpimg
from matplotlib import pyplot as plt
[18]:
mechanical.run_python_script(f"image_dir=ExtAPI.DataModel.AnalysisList[1].WorkingDir")
[18]:
''
[19]:
result_image_dir_server = mechanical.run_python_script("image_dir")
print(f"Images are stored on the server at: {result_image_dir_server}")
Images are stored on the server at: C:\Users\ansys\AppData\Local\Tempwbpj\example_02_Cooled_Turbine_Blade_files\dp0\SYS-8\MECH\
[20]:
def get_image_path(image_name):
    return os.path.join(result_image_dir_server, image_name)
[21]:
def display_image(path):
    print(f"Printing {path} using matplotlib")
    image1 = mpimg.imread(path)
    plt.figure(figsize=(15, 15))
    plt.axis("off")
    plt.imshow(image1)
    plt.show()
[22]:
image_name = "stress.png"
image_path_server = get_image_path(image_name)
[23]:
if image_path_server != "":
    current_working_directory = os.getcwd()
    local_file_path_list = mechanical.download(
        image_path_server, target_dir=current_working_directory
    )
    image_local_path = local_file_path_list[0]
    print(f"Local image path : {image_local_path}")
    display_image(image_local_path)
Downloading dns:///127.0.0.1:56164:C:\Users\ansys\AppData\Local\Tempwbpj\example_02_Cooled_Turbine_Blade_files\dp0\SYS-8\MECH\stress.png to C:\Users\ansys\actions-runner\_work\pyworkbench-examples\pyworkbench-examples\pyworkbench-examples\doc\source\examples\cooled-turbine-blade\stress.png: 100%|██████████| 11.6k/11.6k [00:00<?, ?B/s]
Local image path : C:\Users\ansys\actions-runner\_work\pyworkbench-examples\pyworkbench-examples\pyworkbench-examples\doc\source\examples\cooled-turbine-blade\stress.png
Printing C:\Users\ansys\actions-runner\_work\pyworkbench-examples\pyworkbench-examples\pyworkbench-examples\doc\source\examples\cooled-turbine-blade\stress.png using matplotlib
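The stress.png file itself is produced inside Mechanical by the example script. For reference, here is a hedged sketch of how a result image can be exported with the Mechanical scripting API; the class and method names come from recent Mechanical releases and are assumptions here, not code from this example.
# Hypothetical image export inside Mechanical (not part of this notebook).
result = ExtAPI.DataModel.Project.Model.Analyses[0].Solution.Children[1]
result.Activate()            # display the result in the graphics window
Graphics.Camera.SetFit()     # fit the model into the view
settings = Ansys.Mechanical.Graphics.GraphicsImageExportSettings()
Graphics.ExportImage(image_dir + "stress.png", GraphicsImageExportFormat.PNG, settings)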
Copy all the files from the server's solver directory to the client's current working directory. Verify the target and source paths, then copy the files; this direct copy works here because the server and client share a filesystem.
[24]:
import shutil
import glob
[25]:
current_working_directory = os.getcwd()
target_dir2 = current_working_directory
print(f"Files to be copied from server path at: {target_dir2}")
Files to be copied from server path at: C:\Users\ansys\actions-runner\_work\pyworkbench-examples\pyworkbench-examples\pyworkbench-examples\doc\source\examples\cooled-turbine-blade
[26]:
print(f"All the solver file is stored on the server at: {result_solve_dir_server}")
All the solver file is stored on the server at: C:\Users\ansys\AppData\Local\Tempwbpj\example_02_Cooled_Turbine_Blade_files\dp0\SYS-8\MECH\
[27]:
source_dir = result_solve_dir_server
destination_dir = target_dir2
[28]:
for file in glob.glob(source_dir + '/*'):
    shutil.copy(file, destination_dir)
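When the server runs on a different machine, a direct shutil copy of server paths is not possible. As a sketch, the same transfer could go through the client's download API instead; whether mechanical.download accepts a wildcard pattern is an assumption to verify against the PyMechanical documentation.
# Hypothetical remote-friendly variant of the copy above.
# mechanical.download(os.path.join(result_solve_dir_server, "*"), target_dir=target_dir2)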
Finally, call the exit method on both the PyMechanical and Workbench clients to gracefully shut down the services, ensuring that all resources are properly released.
[29]:
mechanical.exit()
wb.exit()