|
October 6, 2019, 09:31 |
Degassing boundary condition requirements?
|
#1 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
Hi. I'm doing an Euler-Euler packed bed reaction simulation.
I want an open outlet for my primary phase, and from what I have read the degassing boundary should work perfectly for something that simple: the primary phase can pass, but the secondary cannot. However, I always get zero mass flow across the boundary, even with a pressure gradient across that surface. It essentially acts like a wall, having no influence on the flow field of either phase. What specific settings are needed for the degassing boundary condition to function? In the GUI there are no options, not even for applying UDFs, and the manual gives very little information. Can anyone help? |
|
November 2, 2019, 12:50 |
|
#2 |
New Member
Venkat
Join Date: Apr 2019
Location: Bengaluru
Posts: 7
Rep Power: 7 |
Hi!
In the cases I have worked with, the secondary phase is allowed to pass through the outlet while the primary phase is retained within the domain. |
|
February 21, 2020, 04:45 |
|
#3 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
Hi again.
Thanks for the reply, but my issue is that when I run my simulation the primary phase passes through (as intended) and the secondary phase is retained (as intended), but after a while the secondary phase also starts passing through the degassing boundary. This eventually causes numerical instability no matter what I do, and then everything crashes. I guess I'll have to try writing a UDF degassing boundary; perhaps that will work where the built-in one doesn't. |
|
February 21, 2020, 07:57 |
Phase Order
|
#4 |
Senior Member
|
One particular requirement of the degassing boundary condition is that the liquid must be the primary phase. It appears that you have it the other way around.
__________________
Regards, Vinerm PM to be used if and only if you do not want something to be shared publicly. PM is considered to be of the least priority. |
|
February 21, 2020, 08:14 |
|
#5 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
Thanks! It's a bit odd that it did work for about 50k iterations with the gas as primary and the fluid as secondary, but then again it did always crash in the end, so I'll try flipping them around. Again, thanks for the feedback!
|
|
February 21, 2020, 10:46 |
Implementation
|
#6 |
Senior Member
|
It will appear to work, but in the wrong way. The implementation assumes that the secondary phase is the gas. So, if that condition is not met, the boundary starts removing the liquid instead. After a certain time it becomes difficult for the solver to extract liquid, since gravity keeps the liquid at the bottom, while gas keeps collecting because it is the only phase coming in and almost nothing leaves the domain. This accumulation of the gas phase leads to divergence or instability.
__________________
Regards, Vinerm PM to be used if and only if you do not want something to be shared publicly. PM is considered to be of the least priority. |
|
February 21, 2020, 10:48 |
|
#7 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
Again, thanks a lot, I had totally missed that quite important detail!
|
|
February 21, 2020, 16:22 |
|
#8 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
But now I run into an immediate issue. I'm doing a Eulerian multiphase simulation where one phase is hot gas and the other is a slow-moving packed bed of rocks reacting with the gas flow (the rocks release carbon dioxide).
The main thing I'm trying to simulate is the properties of the rocks as they leave the reactor. Using the Eulerian model, I can only set the secondary phase as the packed bed (the primary must be the gas). And you are saying the degassing boundary only works if the secondary phase is the gas? So... I'm back to degassing not working for my Eulerian flow? Why would the models be set up like that, so that they are incompatible? :/ |
|
February 21, 2020, 16:40 |
Degassing
|
#9 |
Senior Member
|
The degassing condition is meant to be used only with liquid-gas flows. Packed beds are granular, so there is no freeboard region. Furthermore, granular phases never occupy 100% of the volume, so the gas or liquid phase is always interspersed. Degassing is therefore not really applicable to your case. However, if you want to implement it with certain modifications, it is available as a UDF example in Fluent's UDF Manual. Essentially, it is a wall boundary condition that consumes gas while acting as a wall for the liquid. |
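For reference, a minimal sketch of that idea (an assumption-laden illustration, not the manual's exact example): the degassing surface itself is left as a wall, and a mass sink is hooked to the gas phase in the cell layer adjacent to that wall. The relaxation time TAU and the zone arrangement are assumptions; the example in the UDF Manual is more complete.

Code:
#include "udf.h"

#define TAU 1.0e-3   /* assumed time scale over which gas reaching the top layer is removed [s] */

DEFINE_SOURCE(degas_gas_sink, c, t, dS, eqn)
{
    /* t is the gas-phase cell thread this source is hooked to, so C_VOF(c,t)
       and C_R(c,t) are the gas volume fraction and gas density in the cell */
    real sink = -C_R(c, t) * C_VOF(c, t) / TAU;   /* kg/(m^3 s) */

    dS[eqn] = 0.0;   /* treat the source term explicitly */
    return sink;
}

The sink would be hooked under the cell zone conditions of the gas phase for the cell layer next to the degassing surface, so gas reaching that layer is removed while the wall still blocks the other phase.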
__________________
Regards, Vinerm PM to be used if and only if you do not want something to be shared publicly. PM is considered to be of the least priority. |
|
February 29, 2020, 06:54 |
|
#10 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
After the good advice given, I changed my approach. I have managed to get all my inlets and outlets to have the correct mass flow (either in or out), and my gas outlet 'almost' works. I think I have misunderstood something about how pointers and domains work here?
I have two phases, Gas with ID=2 and Solids with ID=3. If I do a Report -> Flux -> Mass flow for Solids, I get (kg/s):

bottom -5.0926
canal 0
lansar 0
top 6.5585

This is as it should be. Part of the solids turns into gas and should leave through the canal in the gas phase instead. Hooray. If I do a Report -> Flux -> Mass flow for Gas, I get (kg/s):

bottom 3.9318
canal -12.7713
lansar 1.022
top 8.2667

All flows here are as they should be except canal, which is the gas outlet and which is 1.915 kg/s too low. Mass is accumulating in the system. The code I used is below.

Code:
DEFINE_PROFILE(outlet_gas_totbal, t, i)
{
    face_t f;
    real massflow7 = 0;
    real massflow9 = 0;
    real massflow11 = 0;
    real massflow7s = 0;
    real massflow9s = 0;
    real massflow_tot = 0;
    Domain *gas;
    Domain *solids;
    Thread *t_sub;

    gas = Get_Domain(2);
    solids = Get_Domain(3);

    t_sub = Lookup_Thread(gas, 7);
    begin_f_loop(f, t_sub)   /* Loop over the faces of zone 7 in the gas domain */
    {
        massflow7 += F_FLUX(f, t_sub);   /* Sum the mass flux over the faces */
    }
    end_f_loop(f, t_sub)

    t_sub = Lookup_Thread(gas, 9);
    begin_f_loop(f, t_sub)   /* Loop over the faces of zone 9 in the gas domain */
    {
        massflow9 += F_FLUX(f, t_sub);
    }
    end_f_loop(f, t_sub)

    t_sub = Lookup_Thread(gas, 11);
    begin_f_loop(f, t_sub)   /* Loop over the faces of zone 11 in the gas domain */
    {
        massflow11 += F_FLUX(f, t_sub);
    }
    end_f_loop(f, t_sub)

    t_sub = Lookup_Thread(solids, 7);
    begin_f_loop(f, t_sub)   /* Loop over the faces of zone 7 in the solids domain */
    {
        massflow7s += F_FLUX(f, t_sub);
    }
    end_f_loop(f, t_sub)

    t_sub = Lookup_Thread(solids, 9);
    begin_f_loop(f, t_sub)   /* Loop over the faces of zone 9 in the solids domain */
    {
        massflow9s += F_FLUX(f, t_sub);
    }
    end_f_loop(f, t_sub)

    massflow7 = PRF_GRSUM1(massflow7);     /* Sum the per-core partial sums into a single value */
    massflow9 = PRF_GRSUM1(massflow9);
    massflow11 = PRF_GRSUM1(massflow11);
    massflow7s = PRF_GRSUM1(massflow7s);
    massflow9s = PRF_GRSUM1(massflow9s);

    massflow_tot = massflow7 + massflow9 + massflow11 + massflow7s + massflow9s;

    Message("botten_g %g \n", 2*M_PI*massflow7);
    Message("top_g %g \n", 2*M_PI*massflow9);
    Message("lans %g \n", 2*M_PI*massflow11);
    Message("botten_s %g \n", 2*M_PI*massflow7s);
    Message("top_s %g \n", 2*M_PI*massflow9s);
    Message("tot %g \n", 2*M_PI*massflow_tot);

    t_sub = Lookup_Thread(gas, 5);
    begin_f_loop(f, t_sub)
    {
        F_FLUX(f, t_sub) = -massflow_tot/(2*M_PI);   /* Assign the mass flux out based on the total flow in */
    }
    end_f_loop(f, t_sub)
}

Running it prints

botten_g 0
top_g -6.55854
lans 0
botten_s 5.0926
top_s -6.55847
tot -8.02442

Note that the numbers for top_g, top_s and the reported flux of solids at the top are all about 6.5585 kg/s (though not identical). The faces that end with "_s" (solids, domain 3) seem to be read accurately, but the faces that end with "_g" (gas, domain 2) are not read properly at all. The UDF does not find any flux of gas at "botten_g" or "lans", even though the flux report tool does. So my question is: why does the code work for my solids domain (ID=3) but not find the gas domain (ID=2)? I also tried pointing to the phases by

Code:
d = Get_Domain(1);
gas = DOMAIN_SUB_DOMAIN(d, 0);
solids = DOMAIN_SUB_DOMAIN(d, 1);

What confuses me the most is that in the final part of the UDF I set a new mass flux for the gas phase in the canal (domain 2), and that works fine. It's just the reading of mass fluxes on domain 2 that does not work, and I don't understand why. Best regards, I hope someone can set me straight here? |
|
February 29, 2020, 07:26 |
Get_Domain and Sub_Domain
|
#11 |
Senior Member
|
If you want to use Get_Domain to fetch the domains of the primary or secondary phases, check their IDs in the multiphase panel; the panel where the phases are listed shows their IDs. If gas has ID 2, then your call is correct; otherwise change it.
If you want to use DOMAIN_SUB_DOMAIN, note that the phase index starts at 0 for the primary phase; the secondary phases are indexed from 1 onwards.
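As a quick sanity check, here is a small sketch that compares the two routes; the domain IDs (1 = mixture, 2 = gas, 3 = solids) are the ones from this thread and are assumptions for any other case.

Code:
#include "udf.h"

DEFINE_ON_DEMAND(check_phase_domains)
{
    Domain *mix = Get_Domain(1);                   /* mixture domain */

    /* Route 1: by the phase domain IDs shown in the Phases panel */
    Domain *gas    = Get_Domain(2);
    Domain *solids = Get_Domain(3);

    /* Route 2: by phase index; 0 = primary, 1, 2, ... = secondary phases */
    Domain *gas_i    = DOMAIN_SUB_DOMAIN(mix, 0);
    Domain *solids_i = DOMAIN_SUB_DOMAIN(mix, 1);

    /* Both routes should report the same domain IDs */
    Message("gas:    by ID = %d, by index = %d\n", DOMAIN_ID(gas), DOMAIN_ID(gas_i));
    Message("solids: by ID = %d, by index = %d\n", DOMAIN_ID(solids), DOMAIN_ID(solids_i));
}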
__________________
Regards, Vinerm PM to be used if and only if you do not want something to be shared publicly. PM is considered to be of the least priority. |
|
February 29, 2020, 07:51 |
|
#12 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
Thanks! Yes, gas has ID=2, solids ID=3. And as said, I have no problems reading the faces for the solids (ID=3).
I tried changing the sub-domain indices from 0 and 1 to 1 and 2, but then got an immediate crash, since there is no secondary phase at index 2... But I noticed something when I restarted Fluent! I recompiled my code before loading my simulation, and the console spat out

botten_g -3.93179
top_g -8.2666
lans -1.022
botten_s 5.0926
top_s -6.56012
tot -14.6879

which are the numbers the UDF should give! But when I run it I get 0 on all the gas faces (the first three numbers). So I wanted to know which zone ID the function thinks I'm pointing at.

Code:
t_sub = Lookup_Thread(gas, 7);
zone_ID7 = THREAD_ID(t_sub);
begin_f_loop(f, t_sub)   /* Loop over the faces of zone 7 in the gas domain */
{
    massflow7 += F_FLUX(f, t_sub);   /* Sum the mass flux over the faces */
}
end_f_loop(f, t_sub)

Printing zone_IDXX gives

ID7 7
ID9 9
ID11 11

And the gas flow at ID 9 is reported as the same as the solid flow there (both -6.56011 kg/s). But this is not the case for boundary 7, which also has both a gas and a solid flow (gas = 0 and solids = 6.56). I just don't see the pattern?

Last edited by Per_Holmgren; February 29, 2020 at 09:34. |
|
February 29, 2020, 10:03 |
Suggestions
|
#13 |
Senior Member
|
In my view it should work as well; however, you are doing it differently than it should be done. First, you are using DEFINE_PROFILE but not actually making use of it, since you are not applying a profile. This should instead be done with a general-purpose macro such as DEFINE_ADJUST or DEFINE_EXECUTE_AT_END. If you use DEFINE_PROFILE, you might be hooking it to a particular cell or boundary zone, and since you are not applying any profile, the value being applied might be 0. So I'd suggest you do it in DEFINE_EXECUTE_AT_END.
The second and more important point is that you have multiple face loops that go over the same faces again and again. As far as the mesh is concerned, it belongs to the mixture domain, not to the phases. Therefore, the second argument of begin_f_loop should always be the mixture-domain face thread, not a phase thread. Use the phase threads only to access the variables, e.g.,

Code:
Thread *tf = Lookup_Thread(Get_Domain(1), 7);   /* mixture-domain face thread of zone 7 */
Thread **pt = THREAD_SUB_THREADS(tf);           /* array of per-phase face threads */
face_t f;

begin_f_loop(f, tf)
{
    massflow7  += F_FLUX(f, pt[0]);   /* Mass flux of the primary phase */
    massflow7g += F_FLUX(f, pt[1]);   /* Mass flux of the first secondary phase */
    massflow7s += F_FLUX(f, pt[2]);   /* Mass flux of the second secondary phase, if one exists */
}
end_f_loop(f, tf)
__________________
Regards, Vinerm PM to be used if and only if you do not want something to be shared publicly. PM is considered to be of the least priority. |
|
March 1, 2020, 06:23 |
|
#14 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
Thanks again. I agree the code looked pretty bad as it was; it sort of devolved over time as I tried more and more things to get it to work. It started as a DEFINE_PROFILE function but has evolved quite a bit, and I also experimented with sub-domains to find the values.
The new function looks like this: Code:
DEFINE_EXECUTE_AT_END(outlet_gas_totbal_end)
{
    face_t f;
    real massflow7 = 0;
    real massflow9 = 0;
    real massflow11 = 0;
    real massflow7s = 0;
    real massflow9s = 0;
    real massflow_tot = 0;

    Domain *d = Get_Domain(1);
    Thread *t_7 = Lookup_Thread(d, 7);
    Thread *t_9 = Lookup_Thread(d, 9);
    Thread *t_11 = Lookup_Thread(d, 11);
    Thread *t_5 = Lookup_Thread(d, 5);
    Thread **pt7 = THREAD_SUB_THREADS(t_7);
    Thread **pt9 = THREAD_SUB_THREADS(t_9);
    Thread **pt11 = THREAD_SUB_THREADS(t_11);
    Thread **pt5 = THREAD_SUB_THREADS(t_5);

    begin_f_loop(f, t_7)   /* Loop over the faces of zone 7 on the mixture domain */
    {
        massflow7 += F_FLUX(f, pt7[0]);    /* Read and sum the mass flux of the primary phase */
        massflow7s += F_FLUX(f, pt7[1]);   /* Read and sum the mass flux of the secondary phase */
    }
    end_f_loop(f, t_7)

    begin_f_loop(f, t_9)   /* Loop over the faces of zone 9 on the mixture domain */
    {
        massflow9 += F_FLUX(f, pt9[0]);    /* Read and sum the mass flux of the primary phase */
        massflow9s += F_FLUX(f, pt9[1]);   /* Read and sum the mass flux of the secondary phase */
    }
    end_f_loop(f, t_9)

    begin_f_loop(f, t_11)   /* Loop over the faces of zone 11 on the mixture domain */
    {
        massflow11 += F_FLUX(f, pt11[0]);   /* Read and sum the mass flux of the primary phase */
    }
    end_f_loop(f, t_11)

    massflow7 = PRF_GRSUM1(massflow7);     /* Sum the per-core partial sums into a single value */
    massflow9 = PRF_GRSUM1(massflow9);
    massflow11 = PRF_GRSUM1(massflow11);
    massflow7s = PRF_GRSUM1(massflow7s);
    massflow9s = PRF_GRSUM1(massflow9s);

    massflow_tot = massflow7 + massflow9 + massflow11 + massflow7s + massflow9s;   /* Calculate the mass balance */

    Message("botten_g %g \n", 2*M_PI*massflow7);   /* These are just for debugging, of course */
    Message("top_g %g \n", 2*M_PI*massflow9);
    Message("lans %g \n", 2*M_PI*massflow11);
    Message("botten_s %g \n", 2*M_PI*massflow7s);
    Message("top_s %g \n", 2*M_PI*massflow9s);
    Message("tot %g \n", 2*M_PI*massflow_tot);

    begin_f_loop(f, t_5)   /* Set the flux on the gas outlet canal (zone 5) */
    {
        F_FLUX(f, pt5[0]) = -massflow_tot/(2*M_PI);   /* Assign the mass flux of the primary (gas) phase */
        F_FLUX(f, pt5[1]) = 0;                        /* The secondary-phase mass flux should be zero */
    }
    end_f_loop(f, t_5)
}

It does set the canal outflow successfully, just using the wrong total flow because of the errors above. So, to zoom in on zone 9:

Code:
begin_f_loop(f, t_9)   /* Loop over the faces of zone 9 on the mixture domain */
{
    massflow9 += F_FLUX(f, pt9[0]);    /* Read and sum the mass flux of the primary phase */
    massflow9s += F_FLUX(f, pt9[1]);   /* Read and sum the mass flux of the secondary phase */
}
end_f_loop(f, t_9)

However, "massflow9" is read as identical to "massflow9s", even though it points to the primary phase (pt9[0]) rather than the secondary phase (pt9[1]). And zones 7 and 11 are read as zero for all phases. The flux report tool, as usual, tells me that there is mass flux on all of these faces, but the code can't find it (except for the one mentioned above).

A detail I noticed, though: when I run my simulation, stop it, and then open the mass flux report tool, all flows except those at ID 5 (canal) and ID 9 (top) are initially reported as zero. This is for the mixture domain. The flux at the top (ID 9) is simply reported as the flux of the secondary phase only. These numbers match the numbers my UDF finds, and I need to click Calculate to get the correct flows at the different faces. I'm starting to think I need to reinstall Fluent, because the code *should* work.... |
|
March 1, 2020, 06:37 |
Initial Values
|
#15 |
Senior Member
|
The initial values depend upon the initialization. But if the code is reporting 0 values even after the calculation, then there is something wrong.
One more thing: if the objective of the code is only to report flows, then rather use monitors. If you want to do something more with it, then yes, a UDF may be needed. But your last loop looks strange. Are you trying to apply the flux to a boundary? If so, it won't work this way: F_FLUX cannot be used to apply a profile, only to fetch values. To apply a value, you have to use DEFINE_PROFILE, as you were doing earlier, and write it with F_PROFILE wherever you want to apply it; to fetch, you still use F_FLUX. I suppose this should resolve the issue you are facing. Furthermore, you do not need PRF_GRSUM1 if you are running in serial, and if you are running in parallel, then the rest of the code needs to be parallelized as well.
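For illustration, a minimal sketch of that approach, assuming the canal (zone 5 here) is set to a boundary type that accepts a UDF profile for the mass flux, and that the total imbalance is kept in a global variable updated elsewhere (for example in the DEFINE_EXECUTE_AT_END above); the names are placeholders.

Code:
#include "udf.h"

real massflow_tot = 0.0;   /* updated elsewhere, e.g. in DEFINE_EXECUTE_AT_END */

DEFINE_PROFILE(canal_gas_flux, t, i)
{
    face_t f;

    begin_f_loop(f, t)
    {
        /* write the hooked boundary quantity; the sign and the 2*pi factor
           mirror the convention used in the code above */
        F_PROFILE(f, t, i) = -massflow_tot / (2.0*M_PI);
    }
    end_f_loop(f, t)
}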
__________________
Regards, Vinerm PM to be used if and only if you do not want something to be shared publicly. PM is considered to be of the least priority. |
|
March 1, 2020, 07:15 |
|
#16 |
New Member
Västerbotten
Join Date: Sep 2016
Posts: 23
Rep Power: 10 |
I'm simply trying to write code that sets the mass flow out of a given boundary equal to the net imbalance of the other inlets and outlets.
Using F_FLUX to apply the value I'm after does seem to work, though, both in the DEFINE_PROFILE version and in my DEFINE_EXECUTE_AT_END version: if I set the value that goes into F_FLUX manually in the code, it sets the mass flux correctly at the given boundary. But I think the real issue is that I have not properly adapted my code for parallelization. I use 8 cores and thought PRF_GRSUM1 would suffice for what I'm trying to do. The manual says it runs through the values on each core and sums them into one value, which in my mind should let me fetch the numbers I'm after no matter which core they are computed on. But it seems I need to do a bit more than that, I guess? If I print my messages before using PRF_GRSUM1, I get 8 values for each face (one per core). For example:

botten_g  0 0 0 0 0 0 0 0
top_g     0 0 0 0 0 0 -6.56012 0
lans      0 0 0 0 0 0 0 0
botten_s  0 0 0 0 0 5.0926 0 0
top_s     0 0 0 0 0 0 -6.55991 0

What I'm doing with PRF_GRSUM1 is just summing these, so the zeros are harmless and it doesn't matter which node the value sits on. But what I don't understand is why, for example, "botten_g" is all zeroes. I'll see if I can make the code more compatible with parallel processing, as that simply has to be the problem, but I don't see why this doesn't work. |
|
March 1, 2020, 11:32 |
Parallelization
|
#17 |
Senior Member
|
The code certainly requires parallelization. Still, to check whether parallelization is actually the issue, try the code in serial: if it works as expected, the problem is the missing parallelization; otherwise the issue lies somewhere else.
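For reference, a sketch of the usual parallel pattern for this kind of flux summation; zone 7 and the phase ordering are taken from this thread and are assumptions for other cases.

Code:
#include "udf.h"

DEFINE_EXECUTE_AT_END(sum_flux_parallel)
{
    real massflow7 = 0.0;   /* declared outside the #if blocks so host and nodes both have it */

#if !RP_HOST   /* serial process or compute nodes: only these hold the mesh */
    Domain *d = Get_Domain(1);
    Thread *t_7 = Lookup_Thread(d, 7);
    Thread **pt7 = THREAD_SUB_THREADS(t_7);
    face_t f;

    begin_f_loop(f, t_7)
    {
        if (PRINCIPAL_FACE_P(f, t_7))        /* skip faces duplicated at partition boundaries */
            massflow7 += F_FLUX(f, pt7[0]);  /* primary-phase mass flux */
    }
    end_f_loop(f, t_7)

# if RP_NODE
    massflow7 = PRF_GRSUM1(massflow7);       /* global sum over all compute nodes */
# endif
#endif

    node_to_host_real_1(massflow7);          /* pass the result to the host process */

#if !RP_NODE   /* host (or serial): print once */
    Message("botten_g %g \n", 2*M_PI*massflow7);
#endif
}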
__________________
Regards, Vinerm PM to be used if and only if you do not want something to be shared publicly. PM is considered to be of the least priority. |
|