/**************************************************************************/
/* */
/* OCaml */
/* */
/* Xavier Leroy, projet Gallium, INRIA Rocquencourt */
/* Tom Kelly, OCaml Labs Consultancy, UK */
/* */
/* Copyright 2013 Institut National de Recherche en Informatique et */
/* en Automatique. */
/* Copyright 2022 OCaml Labs Consultancy Ltd. */
/* */
/* All rights reserved. This file is distributed under the terms of */
/* the GNU Lesser General Public License version 2.1, with the */
/* special exception on linking described in the file LICENSE. */
/* */
/**************************************************************************/
/* Asm part of the runtime system, ARM processor, 64-bit mode */
/* Must be preprocessed by cpp */
#include "caml/m.h"
#include "caml/asm.h"
/* Special registers */
#define DOMAIN_STATE_PTR x28
#define TRAP_PTR x26
#define ALLOC_PTR x27
#define ADDITIONAL_ARG x8
#define STACK_ARG_BEGIN x20
#define STACK_ARG_END x21
#define TMP x16
#define TMP2 x17
#define C_ARG_1 x0
#define C_ARG_2 x1
#define C_ARG_3 x2
#define C_ARG_4 x3
/* DWARF
These ARM64-specific register numbers come from
Table 4 ("Mapping from DWARF register numbers to Arm
64-bit architecture registers") of:
https://developer.arm.com/documentation/ihi0057/latest
*/
#define DW_REG_x21 21
#define DW_REG_x29 29
#define DW_REG_sp 31
.set domain_curr_field, 0
#if defined(SYS_macosx)
#define DOMAIN_STATE(c_type, name) DOMAIN_STATE c_type, name
.macro DOMAIN_STATE c_type, name
.equ domain_field_caml_\name, domain_curr_field
.set domain_curr_field, domain_curr_field + 1
.endm
#else
#define DOMAIN_STATE(c_type, name) \
.equ domain_field_caml_##name, domain_curr_field ; \
.set domain_curr_field, domain_curr_field + 1
#endif
#include "../runtime/caml/domain_state.tbl"
#undef DOMAIN_STATE
#define Caml_state(var) [DOMAIN_STATE_PTR, 8*domain_field_caml_##var]
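/* Example (illustrative sketch, not additional code): with the definitions
   above, a load such as
     ldr ALLOC_PTR, Caml_state(young_ptr)
   expands to
     ldr x27, [x28, 8*domain_field_caml_young_ptr]
   i.e. an 8-byte field of the domain state, indexed by the position of
   young_ptr in domain_state.tbl and addressed off DOMAIN_STATE_PTR (x28). */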
/* Globals and labels */
#if defined(SYS_macosx)
#define CAMLSEP(x,y) caml_##x##$##y
#define G(sym) _##sym
#define L(lbl) L##lbl
#else
#define CAMLSEP(x,y) caml_##x##.##y
#define G(sym) sym
#define L(lbl) .L##lbl
#endif
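/* Example of the resulting name mangling (a sketch; the exact spelling
   depends on the platform):
     G(caml_call_gc)            -> _caml_call_gc (macOS) / caml_call_gc (ELF)
     L(jump_to_caml)            -> Ljump_to_caml (macOS) / .Ljump_to_caml (ELF)
     CAMLSEP(system,frametable) -> caml_system$frametable (macOS)
                                   caml_system.frametable (ELF) */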
#if defined(SYS_macosx)
#define ADDRGLOBAL(reg,symb) ADDRGLOBAL reg, symb
.macro ADDRGLOBAL reg, symb
adrp TMP2, G(\symb)@GOTPAGE
ldr \reg, [TMP2, G(\symb)@GOTPAGEOFF]
.endm
#elif defined(__PIC__)
#define ADDRGLOBAL(reg,symb) \
adrp TMP2, :got:G(symb); \
ldr reg, [TMP2, #:got_lo12:G(symb)]
#else
#define ADDRGLOBAL(reg,symb) \
adrp reg, G(symb); \
add reg, reg, #:lo12:G(symb)
#endif
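/* Example (sketch): on an ELF build with __PIC__,
     ADDRGLOBAL(x0, caml_exn_Stack_overflow)
   expands roughly to
     adrp x17, :got:caml_exn_Stack_overflow
     ldr  x0, [x17, #:got_lo12:caml_exn_Stack_overflow]
   i.e. the symbol's address is loaded through the GOT, clobbering
   TMP2 (x17). The non-PIC variant materializes the address directly
   with adrp/add. */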
#if defined(FUNCTION_SECTIONS)
#define TEXT_SECTION(name) .section .text.caml.##name,"ax",%progbits
#else
#define TEXT_SECTION(name)
#endif
#if defined(FUNCTION_SECTIONS)
TEXT_SECTION(CAMLSEP(hot,code_begin))
.globl G(CAMLSEP(hot,code_begin))
G(CAMLSEP(hot,code_begin)):
TEXT_SECTION(CAMLSEP(hot,code_end))
.globl G(CAMLSEP(hot,code_end))
G(CAMLSEP(hot,code_end)):
#endif
#if defined(SYS_macosx)
#define FUNCTION(name) FUNCTION name
.macro FUNCTION name
TEXT_SECTION(G(\name))
.align 2
.globl G(\name)
G(\name):
.endm
#define END_FUNCTION(name)
#define OBJECT(name) OBJECT name
.macro OBJECT name
.data
.align 3
.globl G(\name)
G(\name):
.endm
#define END_OBJECT(name)
#else
#define FUNCTION(name) \
TEXT_SECTION(name); \
.align 2; \
.globl G(name); \
TYPE_DIRECTIVE(G(name),%function) \
G(name):
#define END_FUNCTION(name) \
SIZE_DIRECTIVE(G(name))
#define OBJECT(name) \
.data; \
.align 3; \
.globl G(name); \
TYPE_DIRECTIVE(G(name),%object) \
G(name):
#define END_OBJECT(name) \
SIZE_DIRECTIVE(G(name))
#endif
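/* Typical bracketing of a function in this file (sketch; caml_foo is a
   placeholder name):
     FUNCTION(caml_foo)
       CFI_STARTPROC
       ... body ...
       CFI_ENDPROC
     END_FUNCTION(caml_foo)
   which emits the per-platform section, alignment, .globl and, on ELF,
   type/size directives. */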
/* Function prologue and epilogue */
.macro ENTER_FUNCTION
CFI_OFFSET(29, -16)
CFI_OFFSET(30, -8)
stp x29, x30, [sp, -16]!
CFI_ADJUST(16)
add x29, sp, #0
.endm
.macro LEAVE_FUNCTION
ldp x29, x30, [sp], 16
CFI_ADJUST(-16)
.endm
/* Stack switching operations */
/* struct stack_info */
#define Stack_sp(reg) [reg]
#define Stack_sp_offset 0
#define Stack_exception(reg) [reg, #8]
#define Stack_handler(reg) [reg, #16]
#define Stack_handler_from_cont(reg) [reg, #15]
/* struct c_stack_link */
#define Cstack_stack(reg) [reg]
#define Cstack_sp(reg) [reg, #8]
#define Cstack_sp_offset 8
#define Cstack_prev(reg) [reg, #16]
/* struct stack_handler */
#define Handler_value(reg) [reg]
#define Handler_exception(reg) [reg, #8]
#define Handler_effect(reg) [reg, #16]
#define Handler_parent(reg) [reg, #24]
#define Handler_parent_offset 24
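/* These accessors are assumed to mirror struct stack_info,
   struct c_stack_link and struct stack_handler in the C runtime
   (runtime/caml/fiber.h). Example (sketch): with a stack_info pointer
   in x0,
     ldr TMP, Stack_handler(x0)       ; TMP := handler of the fiber
     ldr TMP, Handler_parent(TMP)     ; TMP := parent fiber (or NULL)
   walks one step up the fiber parent chain. */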
/* Switch from OCaml to C stack.
If a C function is called which might call back into OCaml,
then nothing may be pushed to the C stack between SWITCH_OCAML_TO_C
and the next C call. (This is to ensure frame pointers are correctly
maintained if the stack is reallocated) */
.macro SWITCH_OCAML_TO_C
/* Fill in Caml_state->current_stack->sp */
ldr TMP, Caml_state(current_stack)
mov TMP2, sp
str TMP2, Stack_sp(TMP)
/* Fill in Caml_state->c_stack */
ldr TMP2, Caml_state(c_stack)
str TMP, Cstack_stack(TMP2)
mov TMP, sp
str TMP, Cstack_sp(TMP2)
/* Switch to C stack */
mov sp, TMP2
#ifdef ASM_CFI_SUPPORTED
CFI_REMEMBER_STATE
/* sp points to the c_stack_link. */
.cfi_escape DW_CFA_def_cfa_expression, 5, \
DW_OP_breg + DW_REG_sp, Cstack_sp_offset, DW_OP_deref, \
DW_OP_plus_uconst, 16 /* fp + retaddr */
#endif
.endm
/* Switch from C to OCaml stack. */
.macro SWITCH_C_TO_OCAML
ldr TMP, Cstack_sp(sp)
mov sp, TMP
CFI_RESTORE_STATE
.endm
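/* Typical use of the two macros above (a sketch; caml_some_c_function is
   a placeholder):
     SWITCH_OCAML_TO_C
     bl G(caml_some_c_function)
     SWITCH_C_TO_OCAML
   Nothing is pushed between SWITCH_OCAML_TO_C and the call, as required
   by the comment above, so the frame-pointer chain stays consistent if
   the OCaml stack is reallocated during the call. */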
/* Save all of the registers that may be in use to a free gc_regs bucket,
and store ALLOC_PTR and TRAP_PTR back to Caml_state.
At the end, the saved registers are reachable via Caml_state(gc_regs).
*/
.macro SAVE_ALL_REGS
/* First, save the young_ptr & exn_handler */
str ALLOC_PTR, Caml_state(young_ptr)
str TRAP_PTR, Caml_state(exn_handler)
/* Now, use TMP to point to the gc_regs bucket */
ldr TMP, Caml_state(gc_regs_buckets)
ldr TMP2, [TMP, 0] /* next ptr */
str TMP2, Caml_state(gc_regs_buckets)
/* Save allocatable registers */
stp x0, x1, [TMP, 16]
stp x2, x3, [TMP, 32]
stp x4, x5, [TMP, 48]
stp x6, x7, [TMP, 64]
stp x8, x9, [TMP, 80]
stp x10, x11, [TMP, 96]
stp x12, x13, [TMP, 112]
stp x14, x15, [TMP, 128]
stp x19, x20, [TMP, 144]
stp x21, x22, [TMP, 160]
stp x23, x24, [TMP, 176]
str x25, [TMP, 192]
/* Save caller-save floating-point registers
(callee-saves are preserved by C functions) */
stp d0, d1, [TMP, 208]
stp d2, d3, [TMP, 224]
stp d4, d5, [TMP, 240]
stp d6, d7, [TMP, 256]
stp d16, d17, [TMP, 272]
stp d18, d19, [TMP, 288]
stp d20, d21, [TMP, 304]
stp d22, d23, [TMP, 320]
stp d24, d25, [TMP, 336]
stp d26, d27, [TMP, 352]
stp d28, d29, [TMP, 368]
stp d30, d31, [TMP, 384]
add TMP, TMP, #16
str TMP, Caml_state(gc_regs)
.endm
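/* Resulting bucket layout (byte offsets from the bucket base; note that
   Caml_state(gc_regs) is left pointing at base + 16):
     [base +   0]  next-bucket pointer (unlinked while the bucket is in use)
     [base +  16]  x0..x15, x19..x25   (23 general-purpose registers)
     [base + 208]  d0..d7, d16..d31    (24 floating-point registers) */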
/* Undo SAVE_ALL_REGS by loading the registers saved in Caml_state(gc_regs)
and refreshing ALLOC_PTR & TRAP_PTR from Caml_state */
.macro RESTORE_ALL_REGS
/* Point TMP back at the bucket base: gc_regs points 16 bytes into the
bucket, past the next-ptr slot */
ldr TMP, Caml_state(gc_regs)
sub TMP, TMP, #16
/* Restore registers */
ldp x0, x1, [TMP, 16]
ldp x2, x3, [TMP, 32]
ldp x4, x5, [TMP, 48]
ldp x6, x7, [TMP, 64]
ldp x8, x9, [TMP, 80]
ldp x10, x11, [TMP, 96]
ldp x12, x13, [TMP, 112]
ldp x14, x15, [TMP, 128]
ldp x19, x20, [TMP, 144]
ldp x21, x22, [TMP, 160]
ldp x23, x24, [TMP, 176]
ldr x25, [TMP, 192]
ldp d0, d1, [TMP, 208]
ldp d2, d3, [TMP, 224]
ldp d4, d5, [TMP, 240]
ldp d6, d7, [TMP, 256]
ldp d16, d17, [TMP, 272]
ldp d18, d19, [TMP, 288]
ldp d20, d21, [TMP, 304]
ldp d22, d23, [TMP, 320]
ldp d24, d25, [TMP, 336]
ldp d26, d27, [TMP, 352]
ldp d28, d29, [TMP, 368]
ldp d30, d31, [TMP, 384]
/* Put gc_regs struct back in bucket linked list */
ldr TMP2, Caml_state(gc_regs_buckets)
str TMP2, [TMP, 0] /* next ptr */
str TMP, Caml_state(gc_regs_buckets)
/* Reload new allocation pointer & exn handler */
ldr ALLOC_PTR, Caml_state(young_ptr)
ldr TRAP_PTR, Caml_state(exn_handler)
.endm
/* Updates the oldest saved frame pointer in the target fiber.
A fiber stack may need to grow, causing the reallocation of the entire fiber,
including stack_info and stack_handler structures.
caml_try_realloc_stack will not be able to update the linked list of
frame-pointers if it has been split (i.e., in a continuation).
caml_resume and caml_reperform use this macro to update the oldest saved x29
(highest one in the stack) in case the fiber was reallocated to reattach the
frame-pointer linked list.
REG: Stack_handler(target_fiber)
The frame pointer will be pushed into the stack immediately after these
instructions. The offset of the oldest saved x29 in a fiber from the stack
handler is 48 = 4 words (caml_runstack) + 2 words (x30 and x29).
*/
#ifdef WITH_FRAME_POINTERS
.macro UPDATE_BASE_POINTER reg
sub TMP2, sp, 16
str TMP2, [\reg, -48]
.endm
#else
.macro UPDATE_BASE_POINTER reg
.endm
#endif
#if defined(WITH_THREAD_SANITIZER) /* { */
/* Push the current value of the link register to the stack. */
.macro TSAN_SETUP_C_CALL
CFI_OFFSET(30, -16)
str x30, [sp, -16]!
CFI_ADJUST(16)
.endm
/* Restore the value of the link register from the stack. */
.macro TSAN_CLEANUP_AFTER_C_CALL
ldr x30, [sp], 16
CFI_ADJUST(-16)
.endm
/* Invoke a C function, switching back and forth between the OCaml and C stacks. */
.macro TSAN_C_CALL fun
SWITCH_OCAML_TO_C
TSAN_SETUP_C_CALL
bl \fun
TSAN_CLEANUP_AFTER_C_CALL
SWITCH_C_TO_OCAML
.endm
/* Invoke __tsan_func_entry(return address in the caller) */
.macro TSAN_ENTER_FUNCTION
mov x0, x30 /* arg1: return address in caller */
TSAN_C_CALL G(__tsan_func_entry)
.endm
/* Invoke __tsan_func_exit(0) */
.macro TSAN_EXIT_FUNCTION
mov x0, xzr
TSAN_C_CALL G(__tsan_func_exit)
.endm
/* This is similar to SAVE_ALL_REGS, but only saving the caller-saved
registers. */
.macro TSAN_SAVE_CALLER_REGS
/* First, save the young_ptr & exn_handler */
str ALLOC_PTR, Caml_state(young_ptr)
str TRAP_PTR, Caml_state(exn_handler)
/* Now, use TMP to point to the gc_regs bucket */
ldr TMP, Caml_state(gc_regs_buckets)
ldr TMP2, [TMP, 0] /* next ptr */
str TMP2, Caml_state(gc_regs_buckets)
/* Save caller-saved registers */
stp x0, x1, [TMP, 16]
stp x2, x3, [TMP, 32]
stp x4, x5, [TMP, 48]
stp x6, x7, [TMP, 64]
stp x8, x9, [TMP, 80]
stp x10, x11, [TMP, 96]
stp x12, x13, [TMP, 112]
stp x14, x15, [TMP, 128]
/* Save caller-save floating-point registers */
stp d0, d1, [TMP, 208]
stp d2, d3, [TMP, 224]
stp d4, d5, [TMP, 240]
stp d6, d7, [TMP, 256]
stp d16, d17, [TMP, 272]
stp d18, d19, [TMP, 288]
stp d20, d21, [TMP, 304]
stp d22, d23, [TMP, 320]
stp d24, d25, [TMP, 336]
stp d26, d27, [TMP, 352]
stp d28, d29, [TMP, 368]
stp d30, d31, [TMP, 384]
add TMP, TMP, #16
str TMP, Caml_state(gc_regs)
.endm
/* This is similar to RESTORE_ALL_REGS, but only restoring the caller-saved
registers. */
.macro TSAN_RESTORE_CALLER_REGS
/* Point TMP back at the bucket base: gc_regs points 16 bytes into the
bucket, past the next-ptr slot */
ldr TMP, Caml_state(gc_regs)
sub TMP, TMP, #16
/* Restore registers */
ldp x0, x1, [TMP, 16]
ldp x2, x3, [TMP, 32]
ldp x4, x5, [TMP, 48]
ldp x6, x7, [TMP, 64]
ldp x8, x9, [TMP, 80]
ldp x10, x11, [TMP, 96]
ldp x12, x13, [TMP, 112]
ldp x14, x15, [TMP, 128]
ldp d0, d1, [TMP, 208]
ldp d2, d3, [TMP, 224]
ldp d4, d5, [TMP, 240]
ldp d6, d7, [TMP, 256]
ldp d16, d17, [TMP, 272]
ldp d18, d19, [TMP, 288]
ldp d20, d21, [TMP, 304]
ldp d22, d23, [TMP, 320]
ldp d24, d25, [TMP, 336]
ldp d26, d27, [TMP, 352]
ldp d28, d29, [TMP, 368]
ldp d30, d31, [TMP, 384]
/* Put gc_regs struct back in bucket linked list */
ldr TMP2, Caml_state(gc_regs_buckets)
str TMP2, [TMP, 0] /* next ptr */
str TMP, Caml_state(gc_regs_buckets)
/* Reload new allocation pointer & exn handler */
ldr ALLOC_PTR, Caml_state(young_ptr)
ldr TRAP_PTR, Caml_state(exn_handler)
.endm
#else /* } { */
.macro TSAN_ENTER_FUNCTION
.endm
.macro TSAN_EXIT_FUNCTION
.endm
.macro TSAN_SAVE_CALLER_REGS
.endm
.macro TSAN_RESTORE_CALLER_REGS
.endm
#endif /* } WITH_THREAD_SANITIZER */
/* Allocation functions and GC interface.
Referenced from C code in runtime/startup_nat.c
*/
TEXT_SECTION(caml_system__code_begin)
.globl G(caml_system__code_begin)
G(caml_system__code_begin):
FUNCTION(caml_call_realloc_stack)
CFI_STARTPROC
CFI_SIGNAL_FRAME
/* Save return address and frame pointer */
ENTER_FUNCTION
/* Save all registers (including ALLOC_PTR & TRAP_PTR) */
SAVE_ALL_REGS
ldr C_ARG_1, [sp, 16] /* argument */
SWITCH_OCAML_TO_C
bl G(caml_try_realloc_stack)
SWITCH_C_TO_OCAML
cbz x0, 1f
RESTORE_ALL_REGS
/* Free stack space and return to caller */
LEAVE_FUNCTION
ret
1: RESTORE_ALL_REGS
/* Raise the Stack_overflow exception */
LEAVE_FUNCTION
add sp, sp, 16 /* pop argument */
ADDRGLOBAL(x0, caml_exn_Stack_overflow)
b G(caml_raise_exn)
CFI_ENDPROC
END_FUNCTION(caml_call_realloc_stack)
FUNCTION(caml_call_gc)
CFI_STARTPROC
L(caml_call_gc):
CFI_SIGNAL_FRAME
/* Save return address and frame pointer */
ENTER_FUNCTION
/* Store all registers (including ALLOC_PTR & TRAP_PTR) */
SAVE_ALL_REGS
TSAN_ENTER_FUNCTION
SWITCH_OCAML_TO_C
/* Call the garbage collector */
bl G(caml_garbage_collection)
SWITCH_C_TO_OCAML
TSAN_EXIT_FUNCTION
RESTORE_ALL_REGS
/* Free stack space and return to caller */
LEAVE_FUNCTION
ret
CFI_ENDPROC
END_FUNCTION(caml_call_gc)
FUNCTION(caml_alloc1)
CFI_STARTPROC
ldr TMP, Caml_state(young_limit)
sub ALLOC_PTR, ALLOC_PTR, #16
cmp ALLOC_PTR, TMP
b.lo L(caml_call_gc)
ret
CFI_ENDPROC
END_FUNCTION(caml_alloc1)
FUNCTION(caml_alloc2)
CFI_STARTPROC
ldr TMP, Caml_state(young_limit)
sub ALLOC_PTR, ALLOC_PTR, #24
cmp ALLOC_PTR, TMP
b.lo L(caml_call_gc)
ret
CFI_ENDPROC
END_FUNCTION(caml_alloc2)
FUNCTION(caml_alloc3)
CFI_STARTPROC
ldr TMP, Caml_state(young_limit)
sub ALLOC_PTR, ALLOC_PTR, #32
cmp ALLOC_PTR, TMP
b.lo L(caml_call_gc)
ret
CFI_ENDPROC
END_FUNCTION(caml_alloc3)
FUNCTION(caml_allocN)
CFI_STARTPROC
ldr TMP, Caml_state(young_limit)
sub ALLOC_PTR, ALLOC_PTR, ADDITIONAL_ARG
cmp ALLOC_PTR, TMP
b.lo L(caml_call_gc)
ret
CFI_ENDPROC
END_FUNCTION(caml_allocN)
/* Call a C function from OCaml */
/* Function to call is in ADDITIONAL_ARG */
.macro RET_FROM_C_CALL
ldr TMP, Caml_state(action_pending)
cbnz TMP, 1f
ret
1: mov TMP, #-1
str TMP, Caml_state(young_limit)
ret
.endm
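/* Sketch of how compiled OCaml code is assumed to reach this stub
   (illustrative only; some_c_primitive is a placeholder): the C function's
   address is placed in ADDITIONAL_ARG and caml_c_call is called with the
   C arguments already in x0...x7 / d0...d7:
     ADDRGLOBAL(ADDITIONAL_ARG, some_c_primitive)
     bl G(caml_c_call) */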
FUNCTION(caml_c_call)
CFI_STARTPROC
CFI_SIGNAL_FRAME
ENTER_FUNCTION
TSAN_SAVE_CALLER_REGS
TSAN_ENTER_FUNCTION
TSAN_RESTORE_CALLER_REGS
/* Switch from OCaml to C */
SWITCH_OCAML_TO_C
/* Make the exception handler and alloc ptr available to the C code */
str ALLOC_PTR, Caml_state(young_ptr)
str TRAP_PTR, Caml_state(exn_handler)
/* Call the function */
blr ADDITIONAL_ARG
/* Reload new allocation pointer & exn handler */
ldr ALLOC_PTR, Caml_state(young_ptr)
ldr TRAP_PTR, Caml_state(exn_handler)
/* Load ocaml stack */
SWITCH_C_TO_OCAML
#if defined(WITH_THREAD_SANITIZER)
/* Save return value registers. Since the called function could be
anything, it may have returned its result (if any) either in x0
or d0:d1. */
stp x0, x1, [sp, -16]!
CFI_ADJUST(16)
stp d0, d1, [sp, -16]!
CFI_ADJUST(16)
TSAN_EXIT_FUNCTION
/* Restore return value registers */
ldp d0, d1, [sp], 16
CFI_ADJUST(-16)
ldp x0, x1, [sp], 16
CFI_ADJUST(-16)
#endif
/* Return */
LEAVE_FUNCTION
RET_FROM_C_CALL
CFI_ENDPROC
END_FUNCTION(caml_c_call)
FUNCTION(caml_c_call_stack_args)
CFI_STARTPROC
CFI_SIGNAL_FRAME
/* Arguments:
C arguments : x0 to x7, d0 to d7
C function : ADDITIONAL_ARG
C stack args : begin=STACK_ARG_BEGIN
end=STACK_ARG_END */
ENTER_FUNCTION
/* Switch from OCaml to C */
SWITCH_OCAML_TO_C
/* Make the exception handler and alloc ptr available to the C code */
str ALLOC_PTR, Caml_state(young_ptr)
str TRAP_PTR, Caml_state(exn_handler)
/* Copy the arguments and call */
bl G(caml_c_call_copy_stack_args)
/* Reload new allocation pointer & exn handler */
ldr ALLOC_PTR, Caml_state(young_ptr)
ldr TRAP_PTR, Caml_state(exn_handler)
/* Switch from C to OCaml */
SWITCH_C_TO_OCAML
/* Return */
LEAVE_FUNCTION
RET_FROM_C_CALL
CFI_ENDPROC
END_FUNCTION(caml_c_call_stack_args)
/* To correctly maintain frame pointers during stack reallocation,
the runtime assumes that the caml_c_call stub does not push
anything to the stack before the first frame pointer on the C stack.
To guarantee this when stack arguments are used, the actual pushing
of arguments is done by this separate function */
FUNCTION(caml_c_call_copy_stack_args)
CFI_STARTPROC
ENTER_FUNCTION
CFI_DEF_CFA_REGISTER(DW_REG_x29)
/* Copy arguments from OCaml to C stack
NB: STACK_ARG_{BEGIN,END} are 16-byte aligned */
1: sub STACK_ARG_END, STACK_ARG_END, 16
cmp STACK_ARG_END, STACK_ARG_BEGIN
b.lo 2f
ldp TMP, TMP2, [STACK_ARG_END]
stp TMP, TMP2, [sp, -16]!
b 1b
2: /* Call the function */
blr ADDITIONAL_ARG
/* Restore stack */
mov sp, x29
CFI_DEF_CFA_REGISTER(DW_REG_sp)
LEAVE_FUNCTION
ret
CFI_ENDPROC
END_FUNCTION(caml_c_call_copy_stack_args)
/* Start the OCaml program */
FUNCTION(caml_start_program)
CFI_STARTPROC
CFI_SIGNAL_FRAME
#if defined(WITH_THREAD_SANITIZER)
str x0, [sp, -16]!
CFI_ADJUST(16)
/* We can't use the TSAN_ENTER_FUNCTION macro, as it assumes it is running
on an OCaml stack, and we are still on a C stack at this point. */
mov x0, x30 /* arg1: return address in caller */
TSAN_SETUP_C_CALL
bl G(__tsan_func_entry)
TSAN_CLEANUP_AFTER_C_CALL
ldr x0, [sp], 16
CFI_ADJUST(-16)
#endif
/* domain state is passed as arg from C */
mov TMP, C_ARG_1
/* Initial entry point is caml_program */
ADDRGLOBAL(TMP2, caml_program)
/* Code shared with caml_callback* */
/* Address of domain state is in TMP */
/* Address of OCaml code to call is in TMP2 */
/* Arguments to the OCaml code are in x0...x7 */
L(jump_to_caml):
/* Set up stack frame and save callee-save registers */
CFI_OFFSET(29, -160)
CFI_OFFSET(30, -152)
stp x29, x30, [sp, -160]!
CFI_ADJUST(160)
mov x29, sp
stp x19, x20, [sp, 16]
stp x21, x22, [sp, 32]
stp x23, x24, [sp, 48]
stp x25, x26, [sp, 64]
stp x27, x28, [sp, 80]
stp d8, d9, [sp, 96]
stp d10, d11, [sp, 112]
stp d12, d13, [sp, 128]
stp d14, d15, [sp, 144]
/* Load domain state pointer from argument */
mov DOMAIN_STATE_PTR, TMP
/* Reload allocation pointer */
ldr ALLOC_PTR, Caml_state(young_ptr)
/* Build (16-byte aligned) struct c_stack_link on the C stack */
ldr x8, Caml_state(c_stack)
stp x8, xzr, [sp, -16]! /* C_stack_prev, pad */
CFI_ADJUST(16)
stp xzr, xzr, [sp, -16]! /* C_stack_stack, C_stack_sp */
CFI_ADJUST(16)
mov x8, sp
str x8, Caml_state(c_stack)
/* Load the OCaml stack */
ldr x8, Caml_state(current_stack)
ldr x8, Stack_sp(x8)
/* Store the gc_regs for callbacks during a GC */
ldr x9, Caml_state(gc_regs)
str x9, [x8, -8]!
/* Store the stack pointer to allow DWARF unwind */
mov x9, sp
str x9, [x8, -8]! /* C_stack_sp */
/* Setup a trap frame to catch exceptions escaping the OCaml code */
ldr x9, Caml_state(exn_handler)
adr x10, L(trap_handler)
stp x9, x10, [x8, -16]!
mov TRAP_PTR, x8
/* Switch stacks and call the OCaml code */
mov sp, x8
#ifdef ASM_CFI_SUPPORTED
CFI_REMEMBER_STATE
.cfi_escape DW_CFA_def_cfa_expression, 3 + 2 + 2, \
/* sp points to the exn handler on the OCaml stack */ \
/* sp + 16 contains the C_STACK_SP */ \
DW_OP_breg + DW_REG_sp, 16 /* exn handler */, DW_OP_deref, \
/* 32 struct c_stack_link + pad */ \
/* 18*8 callee save regs */ \
/* 16 fp + ret addr */ \
/* need to split to get under 127 limit */ \
DW_OP_plus_uconst, 96, DW_OP_plus_uconst, 96
#endif
/* Call the OCaml code */
blr TMP2
L(caml_retaddr):
/* Pop the trap frame, restoring Caml_state->exn_handler */
ldr x8, [sp], 16
CFI_ADJUST(-16)
str x8, Caml_state(exn_handler)
L(return_result):
/* restore GC regs */
ldp x8, x9, [sp], 16
CFI_ADJUST(-16)
str x9, Caml_state(gc_regs)
/* Update allocation pointer */
str ALLOC_PTR, Caml_state(young_ptr)
/* Return to C stack */
ldr x8, Caml_state(current_stack)
mov x9, sp
str x9, Stack_sp(x8)
ldr x9, Caml_state(c_stack)
mov sp, x9
CFI_RESTORE_STATE
/* Pop the struct c_stack_link */
ldr x8, Cstack_prev(sp)
add sp, sp, 32
CFI_ADJUST(-32)
str x8, Caml_state(c_stack)
#if defined(WITH_THREAD_SANITIZER)
/* We can't use the TSAN_EXIT_FUNCTION macro, as it assumes it is running
on an OCaml stack, and we are back on a C stack at this point. */
str x0, [sp, -16]!
CFI_ADJUST(16)
mov x0, xzr
TSAN_SETUP_C_CALL
bl G(__tsan_func_exit)
TSAN_CLEANUP_AFTER_C_CALL
ldr x0, [sp], 16
CFI_ADJUST(-16)
#endif
/* Reload callee-save registers and return address */
ldp x19, x20, [sp, 16]
ldp x21, x22, [sp, 32]
ldp x23, x24, [sp, 48]
ldp x25, x26, [sp, 64]
ldp x27, x28, [sp, 80]
ldp d8, d9, [sp, 96]
ldp d10, d11, [sp, 112]
ldp d12, d13, [sp, 128]
ldp d14, d15, [sp, 144]
ldp x29, x30, [sp], 160
CFI_ADJUST(-160)
/* Return to C caller */
ret
CFI_ENDPROC
END_FUNCTION(caml_start_program)
/* The trap handler */
.align 2
L(trap_handler):
CFI_STARTPROC
/* Save exception pointer */
str TRAP_PTR, Caml_state(exn_handler)
/* Encode exception bucket as an exception result */
orr x0, x0, #2
/* Return it */
b L(return_result)
CFI_ENDPROC
/* Exceptions */
.macro JUMP_TO_TRAP_PTR
/* Cut stack at current trap handler */
mov sp, TRAP_PTR
/* Pop previous handler and jump to it */
ldp TRAP_PTR, TMP, [sp], 16
br TMP
.endm
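/* Assumed trap frame shape behind this macro (sketch): TRAP_PTR points at
   two words pushed when the handler was installed (see the
   "stp x9, x10, [x8, -16]!" in caml_start_program): the previous trap
   pointer / exn_handler value, then the handler's code address. Hence
     ldp TRAP_PTR, TMP, [sp], 16
   restores the previous handler chain and leaves the handler's entry
   point in TMP, which is then branched to. */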
/* Raise an exception from OCaml */
FUNCTION(caml_raise_exn)
CFI_STARTPROC
/* Test if backtrace is active */
ldr TMP, Caml_state(backtrace_active)
cbnz TMP, 2f
1:
JUMP_TO_TRAP_PTR
2: /* Zero backtrace_pos */
str xzr, Caml_state(backtrace_pos)
L(caml_reraise_exn_stash):
/* Preserve exception bucket in callee-save register x19 */
mov x19, x0
/* Stash the backtrace */
/* arg1: exn bucket, already in x0 */
mov x1, x30 /* arg2: pc of raise */
mov x2, sp /* arg3: sp of raise */
mov x3, TRAP_PTR /* arg4: sp of handler */
/* Switch to C stack */
ldr TMP, Caml_state(c_stack)
mov sp, TMP
bl G(caml_stash_backtrace)
/* Restore exception bucket and raise */
mov x0, x19
b 1b
CFI_ENDPROC
END_FUNCTION(caml_raise_exn)
FUNCTION(caml_reraise_exn)
CFI_STARTPROC
ldr TMP, Caml_state(backtrace_active)
cbnz TMP, L(caml_reraise_exn_stash)
JUMP_TO_TRAP_PTR
CFI_ENDPROC
END_FUNCTION(caml_reraise_exn)
#if defined(WITH_THREAD_SANITIZER)
/* When TSan support is enabled, this routine should be called just before
raising an exception. It calls __tsan_func_exit for every OCaml frame about
to be exited due to the exception.
Takes no arguments, clobbers x0, x1, x2 and potentially all
caller-saved registers of the C calling convention. */
FUNCTION(caml_tsan_exit_on_raise_asm)
CFI_STARTPROC
mov x0, x30 /* arg1: pc of raise */
mov x1, sp /* arg2: sp of raise */
mov x2, TRAP_PTR /* arg3: sp of handler */
TSAN_C_CALL G(caml_tsan_exit_on_raise)
ret
CFI_ENDPROC
END_FUNCTION(caml_tsan_exit_on_raise_asm)
#endif
/* Raise an exception from C */
FUNCTION(caml_raise_exception)
CFI_STARTPROC
/* Load the domain state ptr */
mov DOMAIN_STATE_PTR, C_ARG_1
/* Load the exception bucket */
mov x0, C_ARG_2
/* Reload trap ptr and alloc ptr */
ldr TRAP_PTR, Caml_state(exn_handler)
ldr ALLOC_PTR, Caml_state(young_ptr)
/* Discard the C stack pointer and reset to ocaml stack */
ldr TMP, Caml_state(current_stack)
ldr TMP, Stack_sp(TMP)
mov sp, TMP
#if defined(WITH_THREAD_SANITIZER)
str x0, [sp, -16]! /* preserve exception bucket */
CFI_ADJUST(16)
/* Call __tsan_func_exit for every OCaml stack frame exited due to the
exception */
mov x1, TMP
ldr x0, [x1, 8] /* arg1: pc of raise */
/* This stack address adjustment is required to compensate for the
saving of x29 and x30 in SWITCH_OCAML_STACKS, which causes
Stack_sp() to be 16 bytes lower than expected. */
add x1, x1, 16 /* arg2: sp of raise */
mov x2, TRAP_PTR /* arg3: sp of handler */
TSAN_C_CALL G(caml_tsan_exit_on_raise)
ldr x0, [sp], 16
CFI_ADJUST(-16)
#endif
/* Restore frame and link on return to OCaml */
LEAVE_FUNCTION
b G(caml_raise_exn)
CFI_ENDPROC
END_FUNCTION(caml_raise_exception)
/* Callback from C to OCaml */
FUNCTION(caml_callback_asm)
CFI_STARTPROC
#if defined(WITH_THREAD_SANITIZER)
/* Save non-callee-saved registers x0, x1, x2 and x30 before C call */
stp x0, x1, [sp, -16]!
CFI_ADJUST(16)
stp x2, x30, [sp, -16]!
CFI_ADJUST(16)
mov x0, x30 /* return address */
bl G(__tsan_func_entry)
ldp x2, x30, [sp], 16
CFI_ADJUST(-16)
ldp x0, x1, [sp], 16
CFI_ADJUST(-16)
#endif
/* Initial shuffling of arguments */
/* (x0 = Caml_state, x1 = closure, [x2] = first arg) */
mov TMP, x0
ldr x0, [x2] /* x0 = first arg */
/* x1 = closure environment */
ldr TMP2, [x1] /* code pointer */
b L(jump_to_caml)
CFI_ENDPROC
END_FUNCTION(caml_callback_asm)
FUNCTION(caml_callback2_asm)
CFI_STARTPROC
#if defined(WITH_THREAD_SANITIZER)
/* Save non-callee-saved registers x0, x1, x2 and x30 before C call */
stp x0, x1, [sp, -16]!
CFI_ADJUST(16)
stp x2, x30, [sp, -16]!
CFI_ADJUST(16)
mov x0, x30 /* return address */
bl G(__tsan_func_entry)
ldp x2, x30, [sp], 16
CFI_ADJUST(-16)
ldp x0, x1, [sp], 16
CFI_ADJUST(-16)
#endif
/* Initial shuffling of arguments */
/* (x0 = Caml_state, x1 = closure, [x2] = arg1, [x2,8] = arg2) */
mov TMP, x0
mov TMP2, x1
ldp x0, x1, [x2, 0] /* x0 = first arg, x1 = second arg */
mov x2, TMP2 /* x2 = closure environment */
ADDRGLOBAL(TMP2, caml_apply2)
b L(jump_to_caml)
CFI_ENDPROC
END_FUNCTION(caml_callback2_asm)
FUNCTION(caml_callback3_asm)
CFI_STARTPROC
#if defined(WITH_THREAD_SANITIZER)
/* Save non-callee-saved registers x0, x1, x2 and x30 before C call */
stp x0, x1, [sp, -16]!
CFI_ADJUST(16)
stp x2, x30, [sp, -16]!
CFI_ADJUST(16)
mov x0, x30 /* return address */
bl G(__tsan_func_entry)
ldp x2, x30, [sp], 16
CFI_ADJUST(-16)
ldp x0, x1, [sp], 16
CFI_ADJUST(-16)
#endif
/* Initial shuffling of arguments */
/* (x0 = Caml_state, x1 = closure, [x2] = arg1, [x2,8] = arg2,
[x2,16] = arg3) */
mov TMP, x0
mov x3, x1 /* x3 = closure environment */
ldp x0, x1, [x2, 0] /* x0 = first arg, x1 = second arg */
ldr x2, [x2, 16] /* x2 = third arg */
ADDRGLOBAL(TMP2, caml_apply3)
b L(jump_to_caml)
CFI_ENDPROC
END_FUNCTION(caml_callback3_asm)
/* Fibers */
/* Switch between OCaml stacks. Clobbers TMP and switches TRAP_PTR.
Preserves the old_stack and new_stack registers. */
.macro SWITCH_OCAML_STACKS old_stack, new_stack
/* Save frame pointer and return address for old_stack */
ENTER_FUNCTION
/* Save OCaml SP and exn_handler in the stack info */
mov TMP, sp
str TMP, Stack_sp(\old_stack)
str TRAP_PTR, Stack_exception(\old_stack)
/* switch stacks */
str \new_stack, Caml_state(current_stack)
ldr TMP, Stack_sp(\new_stack)
mov sp, TMP
/* restore exn_handler for new stack */
ldr TRAP_PTR, Stack_exception(\new_stack)
/* Restore frame pointer and return address for new_stack */
LEAVE_FUNCTION
.endm
/*
 * A continuation is a small heap object whose first field points to a fiber
 * and whose second field caches [last_fiber] (see L(do_perform) below).
 * A fiber [f] will point to its parent at Handler_parent(Stack_handler(f)).
 * In the following, [last_fiber] refers to the last fiber in the linked list
 * formed by the parent pointer.
 */
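/*
 * Layout sketch (offsets as defined by the Handler_* accessors above):
 *
 *   Stack_handler(f):  +0  value handler
 *                      +8  exception handler
 *                      +16 effect handler
 *                      +24 parent fiber (NULL for a captured last_fiber)
 *
 * caml_perform follows Handler_parent of the current stack's handler to
 * find the handling fiber; caml_resume re-links a captured chain by
 * storing the resumer's stack into Handler_parent(Stack_handler(last_fiber)).
 */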
FUNCTION(caml_perform)
CFI_STARTPROC
/* x0: effect to perform
x1: freshly allocated continuation */
ldr x2, Caml_state(current_stack) /* x2 := old stack */
add x3, x2, 1 /* x3 := Val_ptr(old stack) */
str x3, [x1] /* Initialize continuation */
L(do_perform):
/* x0: effect to perform
x1: continuation
x2: old_stack
x3: last_fiber */
#if defined(WITH_THREAD_SANITIZER)
/* Signal to TSan all stack frames exited by the perform. */
TSAN_SAVE_CALLER_REGS
mov x0, x30 /* arg 1: pc of perform */
mov x1, sp /* arg 2: sp of perform */
TSAN_C_CALL G(caml_tsan_exit_on_perform)
TSAN_RESTORE_CALLER_REGS
#endif
str x3, [x1, 8] /* Set the last_fiber field in the continuation */
ldr x9, Stack_handler(x2) /* x9 := old stack -> handler */
ldr x10, Handler_parent(x9) /* x10 := parent stack */
cbz x10, 1f
#if defined(WITH_THREAD_SANITIZER)
/* Save non-callee-saved registers x0, x1, x2, x3, x9 and x10 */
stp x0, x1, [sp, -16]!
CFI_ADJUST(16)
stp x2, x3, [sp, -16]!
CFI_ADJUST(16)
stp x9, x10, [sp, -16]!
CFI_ADJUST(16)
/* Match the TSan-enter made from caml_runstack */
TSAN_EXIT_FUNCTION
ldp x9, x10, [sp], 16
CFI_ADJUST(-16)
ldp x2, x3, [sp], 16
CFI_ADJUST(-16)
ldp x0, x1, [sp], 16
CFI_ADJUST(-16)
#endif
SWITCH_OCAML_STACKS x2, x10
/* we have to null the Handler_parent after the switch because
the Handler_parent is needed to unwind the stack for backtraces */
str xzr, Handler_parent(x9) /* Set parent of performer to NULL */
ldr TMP, Handler_effect(x9)
mov x2, x3 /* x2 := last_fiber */
mov x3, TMP /* x3 := effect handler */
b G(caml_apply3)
1:
/* switch back to original performer before raising Effect.Unhandled
(no-op unless this is a reperform) */
ldr x10, [x1] /* load performer stack from continuation */
sub x10, x10, 1 /* x10 := Ptr_val(x10) */
ldr x9, Caml_state(current_stack)
SWITCH_OCAML_STACKS x9, x10
/* No parent stack. Raise Effect.Unhandled. */
#if defined(WITH_THREAD_SANITIZER)
/* We must let the TSan runtime know that we switched back to the
original performer stack. For that, we perform the necessary calls
to __tsan_func_entry via caml_tsan_entry_on_resume.
Note that from TSan's point of view, we just exited all stack
frames, including those of the main fiber. This is ok, because we
re-enter them immediately via caml_tsan_entry_on_resume below. */
TSAN_SAVE_CALLER_REGS
mov x0, x30 /* arg 1: pc of perform */
mov x1, sp /* arg 2: sp of perform */
mov x2, x10 /* arg 3: performer stack */
TSAN_C_CALL G(caml_tsan_entry_on_resume)
TSAN_RESTORE_CALLER_REGS
#endif
ADDRGLOBAL(ADDITIONAL_ARG, caml_raise_unhandled_effect)
b G(caml_c_call)
CFI_ENDPROC
END_FUNCTION(caml_perform)
FUNCTION(caml_reperform)
CFI_STARTPROC
/* x0: effect to reperform
x1: continuation
x2: last_fiber */
ldr TMP, Stack_handler_from_cont(x2)
ldr x2, Caml_state(current_stack) /* x2 := old stack */
str x2, Handler_parent(TMP) /* Append to last_fiber */
add x3, x2, 1 /* x3 (last_fiber) := Val_ptr(old stack) */
/* Need to update the oldest saved frame pointer here as the execution of
the handler may have caused the current fiber stack to reallocate. */
UPDATE_BASE_POINTER TMP
b L(do_perform)
CFI_ENDPROC
END_FUNCTION(caml_reperform)
FUNCTION(caml_resume)
CFI_STARTPROC
/* x0: new fiber
x1: fun
x2: arg
x3: last_fiber */
sub x0, x0, 1 /* x0 = Ptr_val(x0) */
ldr x4, [x1] /* code pointer */
/* If the stack is null, the continuation has already been resumed */
cbz x0, 1f
#if defined(WITH_THREAD_SANITIZER)
/* Save non-callee-saved registers x0, x1, x2, x3 and x4 */
stp x0, x1, [sp, -16]!
CFI_ADJUST(16)
stp x2, x3, [sp, -16]!
CFI_ADJUST(16)
str x4, [sp, -16]!
CFI_ADJUST(16)
/* Necessary to include the caller of caml_resume in the TSan backtrace */
TSAN_ENTER_FUNCTION
ldr x4, [sp], 16
CFI_ADJUST(-16)
ldp x2, x3, [sp], 16
CFI_ADJUST(-16)
ldp x0, x1, [sp], 16
CFI_ADJUST(-16)
TSAN_SAVE_CALLER_REGS
/* Signal to TSan all the stack frames that will be re-entered by the resume. */
mov x2, x0 /* arg 3: fiber */
ldr x1, Stack_sp(x0)
ldr x0, [x1, 8] /* arg 1: pc of perform */
/* This stack address adjustment is required to compensate for the
saving of x29 and x30 in SWITCH_OCAML_STACKS, which causes
Stack_sp() to be 16 bytes lower than expected. */
add x1, x1, 16 /* arg 2: sp at perform */
TSAN_C_CALL G(caml_tsan_entry_on_resume)
TSAN_RESTORE_CALLER_REGS
#endif
/* Add current stack to the end */
sub x3, x3, 1 /* x3 = Ptr_val(x3) */
ldr x8, Stack_handler(x3)
ldr x9, Caml_state(current_stack)
str x9, Handler_parent(x8)
/* Need to update the oldest saved frame pointer here as the current fiber
stack may have been reallocated or we may be resuming a computation
that was not originally run here. */
UPDATE_BASE_POINTER x8
SWITCH_OCAML_STACKS x9, x0
mov x0, x2
br x4
1: ADDRGLOBAL(ADDITIONAL_ARG, caml_raise_continuation_already_resumed)
b G(caml_c_call)
CFI_ENDPROC
END_FUNCTION(caml_resume)
/* Run a function on a new stack, then either
return the value or invoke exception handler */
FUNCTION(caml_runstack)
CFI_STARTPROC
#if defined(WITH_THREAD_SANITIZER)
/* Save non-callee-saved registers x0, x1 and x2 */
stp x0, x1, [sp, -16]!
CFI_ADJUST(16)
str x2, [sp, -16]!
CFI_ADJUST(16)
/* Necessary to include the caller of caml_runstack in the TSan backtrace */
TSAN_ENTER_FUNCTION
ldr x2, [sp], 16
CFI_ADJUST(-16)
ldp x0, x1, [sp], 16
CFI_ADJUST(-16)
#endif
CFI_SIGNAL_FRAME
/* x0: fiber
x1: fun
x2: arg */
ENTER_FUNCTION
sub x0, x0, 1 /* x0 := Ptr_val(x0) */
ldr x3, [x1] /* code pointer */
/* save old stack pointer and exception handler */
ldr x8, Caml_state(current_stack) /* x8 := old stack */
mov TMP, sp
str TMP, Stack_sp(x8)
str TRAP_PTR, Stack_exception(x8)
/* Load new stack pointer and set parent */
ldr TMP, Stack_handler(x0)
str x8, Handler_parent(TMP)
str x0, Caml_state(current_stack)
ldr x9, Stack_sp(x0) /* x9 := sp of new stack */
/* Create an exception handler on the target stack
after the 16-byte DWARF & gc_regs block (which is unused here) */
sub x9, x9, 32
adr TMP, L(fiber_exn_handler)
str TMP, [x9, 8]
/* link the previous exn_handler so that copying stacks works */
ldr TMP, Stack_exception(x0)
str TMP, [x9]
mov TRAP_PTR, x9
/* Switch to the new stack */
mov sp, x9
#ifdef ASM_CFI_SUPPORTED
CFI_REMEMBER_STATE
.cfi_escape DW_CFA_def_cfa_expression, 3+3+2, \
DW_OP_breg + DW_REG_sp, \
16 /* exn */ + \
8 /* gc_regs slot (unused) */ + \
8 /* C_STACK_SP for DWARF (unused) */ \
+ Handler_parent_offset, DW_OP_deref, \
DW_OP_plus_uconst, Stack_sp_offset, DW_OP_deref, \
DW_OP_plus_uconst, 16 /* fp + ret addr */
#endif
/* Call the function on the new stack */
mov x0, x2
blr x3
L(frame_runstack):
add x8, sp, 32 /* x8 := stack_handler */
ldr x19, Handler_value(x8) /* saved across C call */
1:
mov x20, x0 /* save return across C call */
ldr x0, Caml_state(current_stack) /* arg to caml_free_stack */
/* restore parent stack and exn_handler into Caml_state */
ldr TMP, Handler_parent(x8)
str TMP, Caml_state(current_stack)
ldr TRAP_PTR, Stack_exception(TMP)
str TRAP_PTR, Caml_state(exn_handler)
/* Free the old stack by switching directly to the C stack;
caml_free_stack is a no-alloc call */
ldr x21, Stack_sp(TMP) /* saved across C call */
CFI_RESTORE_STATE
CFI_REMEMBER_STATE
CFI_DEF_CFA_REGISTER(DW_REG_x21)
ldr TMP, Caml_state(c_stack)
mov sp, TMP
bl G(caml_free_stack)
/* switch directly to parent stack */
mov sp, x21
CFI_RESTORE_STATE
/* Signal to TSan that we exit caml_runstack (no registers to save here) */
TSAN_EXIT_FUNCTION
/* pick correct return value */
mov x0, x20
mov x1, x19
ldr TMP, [x19] /* code pointer */
/* Invoke handle_value (or handle_exn) */
LEAVE_FUNCTION
br TMP
L(fiber_exn_handler):
add x8, sp, 16 /* x8 := stack_handler */
ldr x19, Handler_exception(x8)
b 1b
CFI_ENDPROC
END_FUNCTION(caml_runstack)
FUNCTION(caml_ml_array_bound_error)
CFI_STARTPROC
/* Load address of [caml_array_bound_error_asm] in ADDITIONAL_ARG */
ADDRGLOBAL(ADDITIONAL_ARG, caml_array_bound_error_asm)
/* Call that function */
b G(caml_c_call)
CFI_ENDPROC
END_FUNCTION(caml_ml_array_bound_error)
TEXT_SECTION(caml_system__code_end)
.globl G(caml_system__code_end)
G(caml_system__code_end):
/* Frametable - GC roots for callback */
/* Uses the same naming convention as ocamlopt generated modules. */
OBJECT(CAMLSEP(system,frametable))
.quad 2 /* two descriptors */
.quad L(caml_retaddr) /* return address into callback */
.short -1 /* negative frame size => use callback link */
.short 0 /* no roots */
.align 3
.quad L(frame_runstack) /* return address into fiber handler */
.short -1 /* negative frame size => use callback link */
.short 0 /* no roots here */
.align 3
END_OBJECT(CAMLSEP(system,frametable))
NONEXECSTACK_NOTE